Logstash on Docker does not work with Auditbeat

Hi, I am currently trying to configure a Docker container running Logstash and to send logs to it from Auditbeat installed on my PC. The problem is that I constantly get the same error when I run .\auditbeat.exe setup -e:
Exiting: index management requested but the Elasticsearch output is not configured/enabled

I really don't know what to do.

This is my auditbeat.yml

###################### Auditbeat Configuration Example #########################

# This is an example configuration file highlighting only the most common
# options. The auditbeat.reference.yml file from the same directory contains all
# the supported options with more comments. You can use it as a reference.
#
# You can find the full configuration reference here:
# https://www.elastic.co/guide/en/beats/auditbeat/index.html

# =========================== Modules configuration ============================
auditbeat.modules:

- module: file_integrity
  paths:
  - C:/windows
  - C:/windows/system32
  - C:/Program Files
  - C:/Program Files (x86)



- module: system
  datasets:
    - host    # General host information, e.g. uptime, IPs
    - process # Started and stopped processes

  # How often datasets send state updates with the
  # current state of the system (e.g. all currently
  # running processes, all open sockets).
  state.period: 12h

# ======================= Elasticsearch template setting =======================
#setup.template.settings:
#  index.number_of_shards: 1
  #index.codec: best_compression
  #_source.enabled: false


# ================================== General ===================================

# The name of the shipper that publishes the network data. It can be used to group
# all the transactions sent by a single shipper in the web interface.
#name:

# The tags of the shipper are included in their own field with each
# transaction published.
#tags: ["service-X", "web-tier"]

# Optional fields that you can specify to add additional information to the
# output.
#fields:
#  env: staging

# ================================= Dashboards =================================
# These settings control loading the sample dashboards to the Kibana index. Loading
# the dashboards is disabled by default and can be enabled either by setting the
# options here or by using the `setup` command.
#setup.dashboards.enabled: false

# The URL from where to download the dashboards archive. By default this URL
# has a value which is computed based on the Beat name and version. For released
# versions, this URL points to the dashboard archive on the artifacts.elastic.co
# website.
#setup.dashboards.url:

# =================================== Kibana ===================================

# Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
# This requires a Kibana endpoint configuration.
setup.kibana:

  # Kibana Host
  # Scheme and port can be left out and will be set to the default (http and 5601)
  # In case you specify an additional path, the scheme is required: http://localhost:5601/path
  # IPv6 addresses should always be defined as: https://[2001:db8::1]:5601
  host: "kibana:5601"

  # Kibana Space ID
  # ID of the Kibana Space into which the dashboards should be loaded. By default,
  # the Default Space will be used.
  #space.id:

# =============================== Elastic Cloud ================================

# These settings simplify using Auditbeat with the Elastic Cloud (https://cloud.elastic.co/).

# The cloud.id setting overwrites the `output.elasticsearch.hosts` and
# `setup.kibana.host` options.
# You can find the `cloud.id` in the Elastic Cloud web UI.
#cloud.id:

# The cloud.auth setting overwrites the `output.elasticsearch.username` and
# `output.elasticsearch.password` settings. The format is `<user>:<pass>`.
#cloud.auth:

# ================================== Outputs ===================================

# Configure what output to use when sending the data collected by the beat.

# ---------------------------- Elasticsearch Output ----------------------------
#output.elasticsearch:
#  enabled: false
#  # Array of hosts to connect to.
#  hosts: ["https://elasticsearch:9200"]
#

#  # Protocol - either `http` (default) or `https`.
#  #protocol: "https"
#
#  # Authentication credentials - either API key or username/password.
#  #api_key: "id:api_key"
#  username: "xxx"
#  password: "${ES_PWD}"
#  ssl:
#    enabled: true
#    certificate_authorities: ['xxx/new_ca.pem']
#    certificate: "xxx/es01.pem"
#    key: "xxx/es01.key"
  
# ------------------------------ Logstash Output -------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["logstash:5044"]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  ssl.certificate_authorities: ['xxx/new_ca.pem']

  # Certificate for SSL client authentication
  ssl.certificate: "xxx/es01.pem"

  # Client Certificate Key
  ssl.key: "xxx/es01.key"

# ================================= Processors =================================

# Configure processors to enhance or manipulate events generated by the beat.

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
  - add_docker_metadata: ~

And this is the pipeline:

input {
  beats {
    port => "5044"
    host => "0.0.0.0"
  }
}
output {
  elasticsearch {
    hosts => ["https://elasticsearch:9200"]
    user => "xxx"
    password => "xxx"
    index => "auditbeat-"
  }
  file {
    path => "/xxx/output.log"
    codec => line { format => "custom format: %{message}" }
  }
}

I thought I had found the problem, which was the absence of the certificate in the pipeline, but now that I have inserted it, it gives me this error:

WARN ][logstash.outputs.elasticsearch][main] Failed to perform request {:message=>"Certificate for <xxx> doesn't match any of the subject alternative names: [localhost, 127.0.0.1, es01]", :exception=>Manticore::UnknownException, :cause=>#<Java::JavaxNetSsl::SSLPeerUnverifiedException: Certificate for <xxx> doesn't match any of the subject alternative names: [localhost, 127.0.0.1, es01]>

I have tried with both new_ca.pem and es01.pem.

Now I don't know what certificate it wants. Can you help me?

1st: As the error says, your auditbeat.yml will need to have the output section configured to point directly to Elasticsearch, not Logstash, in order to run setup.

That is because Auditbeat needs to load assets directly into Elasticsearch and cannot do that when pointed at Logstash.

After you run setup once successfully, THEN change back to point to Logstash.
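A convenient way to do this without editing the config file twice is to override the outputs on the command line just for the setup run. A sketch, reusing the host and credentials from your commented-out Elasticsearch block above (since your cluster uses HTTPS, you would also pass your output.elasticsearch.ssl.* settings the same way):

.\auditbeat.exe setup -e -E output.logstash.enabled=false -E 'output.elasticsearch.hosts=["https://elasticsearch:9200"]' -E output.elasticsearch.username=xxx -E output.elasticsearch.password=xxx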

However, I always recommend that a new user configure and run the simplest architecture first:

Auditbeat -> Elasticsearch

Get that running, make sure everything works, and then introduce Logstash.
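For the simple setup, that just means re-enabling the output.elasticsearch block you already have commented out in auditbeat.yml and disabling the Logstash output while you test; a sketch reusing the values from your own config:

output.elasticsearch:
  hosts: ["https://elasticsearch:9200"]
  username: "xxx"
  password: "${ES_PWD}"
  ssl:
    enabled: true
    certificate_authorities: ['xxx/new_ca.pem']

#output.logstash:
#  hosts: ["logstash:5044"]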

This architecture is fine but more complex, and I can see from your configs that it will not work as expected, especially as you are running Logstash in Docker:

Auditbeat -> Logstash -> Elasticsearch

Your pipeline will need to look something like the sketch below (of course you will need to add your specifics); even though the documented example says filebeat, it is the same for auditbeat.
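A minimal sketch, based on the standard Beats-to-Elasticsearch pipeline from the Logstash docs. The index pattern, credentials, and CA path are illustrative; the path assumes you mount the CA into the Logstash container, and depending on your plugin version the SSL options may be named ssl_enabled / ssl_certificate_authorities instead:

input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    # use a host name that appears in the certificate's SANs (es01 in your error message)
    hosts => ["https://es01:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}"
    user => "xxx"
    password => "xxx"
    ssl => true
    cacert => "/usr/share/logstash/config/certs/new_ca.pem"
  }
}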

2nd: The certs...

Most likely your Logstash is not running with SSL unless you specifically set that up, so the certs are probably not the correct ones, since they seem to reference the Elasticsearch certs. Perhaps you set up SSL for Logstash, but you did not share any of that. SSL is not enabled by default in Logstash.
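If you do want TLS on the Auditbeat-to-Logstash hop, you have to enable it explicitly on the beats input, with a certificate issued for the Logstash host (not the Elasticsearch certs). A hypothetical sketch; the paths are illustrative, and note that the beats input expects the private key in PKCS#8 format:

input {
  beats {
    port => 5044
    ssl => true
    # server certificate and key for the Logstash host itself (illustrative paths)
    ssl_certificate => "/usr/share/logstash/config/certs/logstash.crt"
    ssl_key => "/usr/share/logstash/config/certs/logstash.pkcs8.key"
  }
}

The ssl.certificate_authorities setting on the Auditbeat side would then need to point at the CA that signed this Logstash certificate.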

It looks like you are trying to use the Elasticsearch certs for the Logstash output in auditbeat.yml; that won't work. Then, in the elasticsearch output section of the Logstash pipeline, you will need the cert for Elasticsearch, so the certs will need to be mounted / available inside the Logstash container; you will need to do that in your Docker configuration. Also note the SAN error you posted: the hostname in hosts => must be one the certificate was issued for, and with SANs of [localhost, 127.0.0.1, es01], pointing the pipeline at https://es01:9200 is what would match (assuming the Elasticsearch container is reachable under that name on your Docker network).
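For example, if you run Logstash via docker-compose, a hypothetical volume mount to make the Elasticsearch CA available inside the container (paths are illustrative):

services:
  logstash:
    # ...the rest of your logstash service definition...
    volumes:
      - ./certs/new_ca.pem:/usr/share/logstash/config/certs/new_ca.pem:ro

The container-side path is then what you would reference as cacert in the elasticsearch output of your pipeline.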

So... you have many misconfigurations at this point.

I would highly encourage you to get Auditbeat working directly with Elasticsearch, and then come back once that is working and we can work on getting Logstash to work, which I would not run in Docker if this is your first time.

Hi, thanks for your answer.
I have already configured Auditbeat -> Elasticsearch, and that works fine.
The problem with the certs is that the only ones I could find are on the Elasticsearch docker, I have no docker in the logstash one, so I don't know which one I should use for the pipeline.

Apologies, I don't know what that means.

Are you trying to set up SSL between every connection?

Sorry, I meant that I have no certificates in the Logstash docker.
Yes, I am trying to set up SSL for every connection, because I have the Elasticsearch host working with HTTPS.
