Include Exim main log


How can I, in the current version, ingest the Exim log and search its entries via Kibana?

With Filebeat I can transfer the log into Kibana, but the field mapping there is not correct. What do I have to configure so that I can search for an email address in Kibana (for example) and get all entries from Exim's main.log that match it?

Somehow I don't know what to do. I hope someone can help me.

Thanks a lot

Welcome to our community! :smiley:

Just so I am understanding correctly - you are sending the exim logs via Filebeat?

Sorry, what does this mean?

Thank you very much for the nice welcome to the community.

So with Filebeat I import the main.log (from Exim) into Elasticsearch/Kibana.

This is what my filebeat.yml looks like


# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/exim/main.log
    #- c:\programdata\elasticsearch\logs\*

In Kibana it looks like this

@timestamp:Aug 24, 2021 @ 10:01:00.011
agent.ephemeral_id:c0dff231-f7d3-4c54-b7f6-ed0642d02ccb
agent.type:filebeat
agent.version:7.14.0
ecs.version:1.10.0
host.architecture:x86_64
host.containerized:false
host.hostname:centos7-ela
host.ip:fe80::250:56ff:feb0:1b48
host.mac:00:50:56:b0:1b:48
host.os.codename:Core
host.os.kernel:3.10.0-1160.36.2.el7.x86_64 Linux Linux
host.os.platform:centos
host.os.type:linux
host.os.version:7 (Core)
input.type:log
log.file.path:/var/log/exim/main.log
log.offset:1,512,705
message:2021-08-22 10:02:10 1mHiQk-00087D-1A <=

All of this sits in a single Document column.

But I would like to set it up so that there is a column containing the recipient from this example, and so that I can search for it via the search field.

Example: if I search for a given address, I would like to see all entries that have it as the recipient.

And I do not know how to do that. Probably a template or something similar is missing here.

Can you help me with that? I have already looked at the following link but did not get anywhere.

Right, that's a bit clearer :slight_smile:

You will need to parse the events so they are split up in Elasticsearch. Try something like GitHub - sboschert/filebeat-module-exim4: A Filebeat module that parses log files created by Exim 4.
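To illustrate what the parsing gets you: each main.log message line starts with a timestamp, a message ID, and a direction flag (`<=` arrival, `=>` delivery), followed by an address. A minimal sketch in Python — the field names here are my own choices, not the ones that module emits:

```python
import re

# Split an Exim main.log message line into structured fields.
# Flags: <= arrival, => delivery, -> additional delivery,
# ** failed, == deferred.
EXIM_LINE = re.compile(
    r"^(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<msg_id>\S{6}-\S{6}-\S{2}) "
    r"(?P<flag><=|=>|->|\*\*|==) "
    r"(?P<address>\S+)"
)

def parse_exim_line(line):
    """Return a dict of fields for a message-event line, else None."""
    m = EXIM_LINE.match(line)
    return m.groupdict() if m else None

line = "2021-08-22 10:02:10 1mHiQk-00087D-1A <= alice@example.com H=mail.example.com [192.0.2.1]"
print(parse_exim_line(line)["address"])  # alice@example.com
```

Once each line is broken up like this on the Elasticsearch side (the module does it with grok patterns in an ingest pipeline), the address becomes its own searchable field instead of being buried in `message`.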

2021-08-24T12:48:41.226+0200    ERROR   instance/beat.go:989    Exiting: data path already locked by another beat. Please make sure that multiple beats are not sharing the same data path (

The error occurs when restarting Filebeat, after adding and activating the exim4 module.

I just have no idea what the problem is.

Did you check to see if there's already a Beat process running?

I have set up the whole server again; the module is now active and the error no longer appears. There must have been a misconfiguration somewhere.

How can I create a field in Kibana (Discover) that contains the email address from the Exim log?

EDIT: It looks exactly the same in Kibana as it did before the exim4 module was activated.
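If the module's pipeline is not populating fields, one workaround (a sketch, untested against this setup) is a runtime field that extracts the first email address from `message`. The field name `exim.address` is my invention, and this assumes runtime fields are available (Elasticsearch 7.11+) and that `message` is reachable via `_source`:

```
PUT filebeat-*/_mapping
{
  "runtime": {
    "exim.address": {
      "type": "keyword",
      "script": {
        "source": "String a = grok('%{EMAILADDRESS:a}').extract(params._source.message)?.a; if (a != null) emit(a);"
      }
    }
  }
}
```

With something like that in place, Discover can show `exim.address` as its own column, and a query such as `exim.address:"someone@example.com"` should match the main.log entries for that address.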

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.