Elasticsearch & Kibana Audit logs

Hi Team,

The Elasticsearch audit logs are growing by about 20 GB per day. Can you please help me minimize the size of the audit log files?

Regards,
Syed

Hi Team,

We are using Elasticsearch & Kibana 7.16.2, and the same version of Filebeat. We enabled audit logging by setting the audit keys in elasticsearch.yml, but it is producing about 20 GB of logs per day. Requesting your help in minimizing the size of the audit log files.

Regards,
Syed

Hi Elastic Support Team,

Can we get any update on this?

Regards,
Syed

Hi @syed0510

Yes, auditing can be quite extensive. Please refer to these two documentation pages:

Configuring Auditing

Event Types

You can decide which event types you want to include and/or exclude in your audit logs,

using

xpack.security.audit.logfile.events.include

and / or

xpack.security.audit.logfile.events.exclude

Or you can disable auditing entirely if you do not need it.
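For example, a minimal elasticsearch.yml sketch (the event type names here are illustrative of what the Event Types page lists; pick the ones that matter to you):

```yaml
# Sketch only: log just the authentication-related events and drop
# the rest. access_granted in particular tends to dominate volume.
xpack.security.audit.enabled: true
xpack.security.audit.logfile.events.include: ["authentication_success", "authentication_failed", "access_denied"]
```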

Hi @stephenb ,

Thanks for the reply!

I will check the document that you have shared and will update you accordingly.

Regards,
Syed

Hi @stephenb,

Could you please help resolve the queries below:

  1. Is there any way to set the log entry format of the audit event attributes?
  2. How can we use xpack.security.audit.logfile.events.include if we want to get a list of all users who logged in successfully?

Thanks!

Regards,
Syed

There may be, but it is a highly structured JSON log following ECS. It is very easy to read, and there is even a Filebeat module to load it into Elasticsearch (see below). You would need to read/adjust the logging output in the config/log4j2.properties file, which is not my area of expertise.

A quick look at the docs suggests it would look like:

xpack.security.audit.logfile.events.include: ["authentication_success"]

That would log each successful login.

Then you would read those logs into elasticsearch and visualize the results using...

Perhaps take a look at

Hi @stephenb,

Thanks for the update!

We are already using Filebeat; that's why the audit log data reached 20 GB. Please find below the Filebeat input settings from filebeat.yml:

```yaml
# ============================== Filebeat inputs ===============================

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

# filestream is an input for collecting log messages from files.
- type: filestream

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    #- /var/log/*.log
    - D:\Elastic\logs\elasticsearch_audit-*.json

#-------------------------------------------
```

Requesting your help in minimizing the size of the audit log files through Filebeat.

Regards,
Syed

Hi @stephenb ,

We have installed Filebeat on the same server where the ELK stack is installed, and pointed filebeat.yml directly at the path of the Elasticsearch audit log file.

Regards,
Syed

I gave you direction to only log login events.

Filebeat has nothing to do with the size of the logs.
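(Strictly speaking, Filebeat processors could drop audit events before they are shipped, but that only reduces what gets indexed, not the size of the audit file on disk. A hypothetical filebeat.yml fragment; the field value is an assumption based on the ECS layout of the audit events:)

```yaml
# Sketch: drop access_granted audit events before shipping.
# This shrinks the index, NOT the on-disk audit log file.
processors:
  - drop_event:
      when:
        equals:
          event.action: access_granted
```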

It is unclear what you want.

What exactly is 20 GB: the audit log file on disk?

The index of the audit logs in Elasticsearch?

Everything you need is in the docs that I have provided.

There are settings for how long to retain the logs, plus I showed you how to only log the login events... Best I can do...

Read the docs, adjust, test, adjust.
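For retention of the audit files themselves, the rollover strategy in config/log4j2.properties can be given a delete action. This is only a sketch, assuming the default audit_rolling appender name; the filename glob and size threshold are illustrative:

```properties
# Sketch: cap total disk usage of rolled audit files (names assumed
# to match the default *_audit-*.json pattern).
appender.audit_rolling.strategy.type = DefaultRolloverStrategy
appender.audit_rolling.strategy.action.type = Delete
appender.audit_rolling.strategy.action.basepath = ${sys:es.logs.base_path}
appender.audit_rolling.strategy.action.condition.type = IfFileName
appender.audit_rolling.strategy.action.condition.glob = *_audit-*.json
appender.audit_rolling.strategy.action.condition.nested_condition.type = IfAccumulatedFileSize
appender.audit_rolling.strategy.action.condition.nested_condition.exceeds = 5GB
```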

Hi @stephenb,

Thanks for the support!

Actually, I only want to know whether the log path provided in filebeat.yml is correct, as we have pointed filebeat.yml directly at the ELK audit log file.


Regards,
Syed

That depends on where you installed Elasticsearch. Did you check whether the logs are there?

There is a filebeat module that knows how to parse the audit logs.
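A minimal sketch of enabling it in modules.d/elasticsearch.yml (the path below is adapted from your filebeat.yml above and is an assumption; single quotes keep the backslashes literal in YAML):

```yaml
# Sketch: the Filebeat elasticsearch module's audit fileset parses
# the structured audit JSON for you. Path is illustrative.
- module: elasticsearch
  audit:
    enabled: true
    var.paths: ['D:\Elastic\logs\elasticsearch_audit-*.json']
```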

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.