I have a problem sending logs from clamav-0.104.2.linux.x86_64 to Elasticsearch and Kibana version 7.14.2.
I configured pfSense to send logs to the Elastic Stack, but I did not find the best procedure to configure Elasticsearch and Kibana (7.14.2).
Could you help me please? It is urgent! Thanks.
Hi @khouloud1, I need a few more details to better understand what you're trying to do.
If I understood correctly, you did manage to send logs from your ClamAV to Elasticsearch, right?
However, you're looking for a better way to set up Elasticsearch and Kibana, is that correct?
How are you currently sending those logs to ES? Are you using Filebeat?
Thanks indeed for your reply.
We are using Filebeat to send logs of Clamav to Elasticsearch.
We used the two links below to do the configuration, but it is not working:
ClamAV + Filebeat. Free and open source centralised AV… | by Gerard Arall | Medium
@khouloud1 I still need some more details about the problems you are facing and the versions you are running:
- Is Elasticsearch up and running?
- Did you manage to access Kibana?
- Which version of the Elastic Stack (Elasticsearch, Kibana, Filebeat, etc) are you running?
- Did you manage to get Filebeat connecting to Elasticsearch? Try the filebeat test output command (see the sample output after this list).
- Do you see any logs on Kibana?
- Do you see any error logs from Filebeat?
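For reference, a healthy filebeat test output run against a local, unsecured Elasticsearch prints something roughly like this (hosts and version will differ in your setup):

elasticsearch: http://localhost:9200...
  parse url... OK
  connection...
    parse host... OK
    dns lookup... OK
    addresses: 127.0.0.1
    dial up... OK
  TLS... WARN secure connection disabled
  talk to server... OK
  version: 7.14.2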
Before diving into this third-party module you linked, I suggest trying to ingest the raw logs from ClamAV, so we can make sure your stack is working fine. You can use the filestream input from Filebeat to ship the logs to Elasticsearch. As a first try, don't worry about parsing them; just focus on getting them into Elasticsearch and visualising them in Kibana.
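As a minimal sketch of that first try, a bare-bones filebeat.yml could look like the following, assuming Elasticsearch runs locally without security enabled; the log path is an assumption you should adjust to wherever clamd writes its log:

filebeat.inputs:
  - type: filestream
    paths:
      - /var/log/clamav/clamd.log   # assumption: adjust to your actual clamd log location

output.elasticsearch:
  hosts: ["http://localhost:9200"]  # assumption: adjust to your Elasticsearch host

Once events show up under the filebeat-* index pattern in Kibana Discover, you know the path from Filebeat to Elasticsearch is healthy, and parsing can be tackled separately.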
Hello, thanks for your response.
1/ 2/ Yes, Elasticsearch and Kibana are up and running, so we can access Kibana.
3/ We use 7.14.2, and the Elastic Stack (Elasticsearch, Kibana, Filebeat) is running.
4/ Of course, we tested the output of the connection and it's OK.
5/ We did not see any logs in Kibana.
6/ Yes, we did see these error messages for Filebeat/ClamAV.
It seems you placed some of the files in a place different from the one Filebeat expects.
I've just tested it and had no problems:
2022-03-21T18:01:27.081+0100 INFO instance/beat.go:309 Setup Beat: filebeat; Version: 7.14.2
2022-03-21T18:01:27.081+0100 INFO [index-management] idxmgmt/std.go:184 Set output.elasticsearch.index to 'filebeat-7.14.2' as ILM is enabled.
2022-03-21T18:01:27.081+0100 INFO [esclientleg] eslegclient/connection.go:100 elasticsearch url: http://localhost:9200
2022-03-21T18:01:27.081+0100 INFO [publisher] pipeline/module.go:113 Beat name: x-wing
2022-03-21T18:01:27.081+0100 INFO beater/filebeat.go:117 Enabled modules/filesets: (), clamav (log)
2022-03-21T18:01:27.082+0100 INFO [monitoring] log/log.go:118 Starting metrics logging every 30s
2022-03-21T18:01:27.082+0100 INFO instance/beat.go:473 filebeat start running.
2022-03-21T18:01:27.088+0100 INFO memlog/store.go:119 Loading data file of '/home/tiago/sandbox/CommunityRotation/clamav/filebeat-7.14.2-linux-x86_64/data/registry/filebeat' succeeded. Active transaction id=0
2022-03-21T18:01:27.088+0100 INFO memlog/store.go:124 Finished loading transaction log file for '/home/tiago/sandbox/CommunityRotation/clamav/filebeat-7.14.2-linux-x86_64/data/registry/filebeat'. Active transaction id=0
2022-03-21T18:01:27.088+0100 INFO [registrar] registrar/registrar.go:109 States Loaded from registrar: 0
2022-03-21T18:01:27.088+0100 INFO [crawler] beater/crawler.go:71 Loading Inputs: 1
2022-03-21T18:01:27.088+0100 INFO [input] log/input.go:164 Configured paths: [/tmp/clamd.log*] {"input_id": "c3a5bf17-2b92-4f1d-b85f-77e29dd9112a"}
2022-03-21T18:01:27.088+0100 INFO [crawler] beater/crawler.go:141 Starting input (ID: 6901489908251514153)
2022-03-21T18:01:27.089+0100 INFO [input] log/input.go:164 Configured paths: [/tmp/clamd.log*] {"input_id": "c56b17aa-ca6f-4e5b-a57b-ba03b0c16f02"}
2022-03-21T18:01:27.089+0100 INFO [crawler] beater/crawler.go:108 Loading and starting Inputs completed. Enabled inputs: 1
2022-03-21T18:01:27.089+0100 INFO cfgfile/reload.go:164 Config reloader started
2022-03-21T18:01:27.089+0100 INFO [input] log/input.go:164 Configured paths: [/tmp/clamd.log*] {"input_id": "1f8c7712-0741-4ad7-9047-d90a9a8d0bd2"}
2022-03-21T18:01:27.090+0100 INFO [esclientleg] eslegclient/connection.go:100 elasticsearch url: http://localhost:9200
Try this:
- Create a temporary folder and cd into it.
- Clone the module repo:
git clone https://github.com/arall/filebeat-clamav.git
- Download Filebeat (or another version; I recommend updating to a supported one):
wget https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-7.14.2-linux-x86_64.tar.gz
- Extract Filebeat:
tar -xf filebeat-7.14.2-linux-x86_64.tar.gz
- Copy the module into Filebeat's folder:
cp -rav filebeat-clamav/* filebeat-7.14.2-linux-x86_64
- cd into Filebeat's folder:
cd filebeat-7.14.2-linux-x86_64
- Update the filebeat.example.yml (from the module) to match your output and the files you want to ingest (see the sketch after these steps).
- Run Filebeat:
./filebeat -c filebeat.example.yml -e -v
(the -e -v flags make Filebeat a bit more verbose and log to stderr).
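For the filebeat.example.yml step, the parts you would typically adjust look something like the sketch below. The exact keys come from the third-party module's example file, so treat this as an illustrative sketch; the log path and Elasticsearch host are assumptions to replace with your own values:

filebeat.modules:
  - module: clamav
    log:
      enabled: true
      var.paths:
        - /var/log/clamav/clamd.log*   # assumption: point this at your actual clamd log file(s)

output.elasticsearch:
  hosts: ["http://localhost:9200"]     # assumption: match your Elasticsearch endpoint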
That worked quite well for me, and I believe it will work for you as well.
Hi,
I tried what you suggested, but I get this error message in Kibana:
error.message: Provided Grok expressions do not match field value: [/path/eicar.com.txt: Win.Test.EICAR_HDB-1 FOUND]
N.B.: Filebeat's config and output tests are OK.
Could you help, please?
I see that the message field cannot be parsed into the two fields clamav.virus and file.path.
It seems the module is working fine, but the logs are in a different format than the one expected by the ingest pipeline.
This ClamAV module is not developed by Elastic, so I have very little knowledge about it.
My suggestion is to double check you're ingesting the correct log files and make sure they're in the correct format.
If you can't get the logs to be ingested, then creating another topic focusing on the Grok pattern/log format is the best way to go.
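As a starting point for that topic, a Grok pattern along these lines does match the sample value from your error, and you can verify it against your real log lines in Kibana's Grok Debugger (under Dev Tools). The field names mirror the ones the module apparently expects, but this is an illustrative sketch, not the module's actual pattern:

%{GREEDYDATA:file.path}: %{DATA:clamav.virus} FOUND

If your clamd.log lines carry a leading timestamp (ClamAV often writes lines like "Mon Mar 21 18:01:27 2022 -> ..."), the pattern would also need to account for that prefix, which may be exactly why the module's expression fails.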
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.