Filebeat integration via file output

Good afternoon.

I tried to set up a disconnected architecture between production and monitoring, as there is no connectivity between my production servers and my ELK stack (local network).

I planned for it to work as follows:

  1. Filebeat processes the relevant logs and saves them to a file (via the file output plugin)
  2. the file is synced to the ELK server
  3. Logstash reads the Filebeat output file as its input

As per the Elastic documentation, such an approach seems possible (Configure the File output | Filebeat Reference [7.13] | Elastic); a sketch of the output configuration is shown below.
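
As a rough illustration, the file output section of filebeat.yml could look like this (the paths and filenames here are placeholders, not my actual values):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/app/*.log        # placeholder input path

# Write processed events to local files instead of a network output;
# each event is serialized as one JSON document per line.
output.file:
  path: "/tmp/filebeat"           # placeholder drop directory
  filename: filebeat-output
```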

However, I have not managed to configure Logstash correctly:
using the file input plugin (tail mode), the pipeline runs fine, BUT the Beats logs are treated as plain text, so Logstash puts each Filebeat log line into the "message" field instead of handling it as it would with the beats input plugin.

Is it possible to prevent this from happening?

Yes, you need to set the JSON codec on your file input in Logstash, OR use the JSON filter in your Logstash configuration with the "message" field as the source.
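
Something along these lines (untested sketch; the path is a placeholder, adjust it to where the synced file lands):

```
input {
  file {
    path => "/path/to/synced/filebeat-output-*"   # placeholder: the synced Filebeat output file
    mode => "tail"
    # Option 1: parse each line as JSON directly at the input
    codec => "json"
  }
}

filter {
  # Option 2: keep the default plain codec on the input above and
  # parse the line here instead (use one of the two options, not both)
  # json {
  #   source => "message"
  # }
}
```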

Hi AquaX!

Indeed, I had not checked whether the format was JSON, so it should work the same way as the ModSecurity log integration.

But it seems there is one core difference between the two log files:

  • ModSecurity output does not contain a top-level "message" field in each JSON entry, and it works fine
  • Filebeat output does contain a top-level "message" field in each JSON entry.

If:

  • I use a JSON filter with "message" as the source, as for ModSecurity, it tries to parse the "message" field, which is a plain string
  • I only use the JSON codec, it fails with "Can't get text on a START_OBJECT" errors

(Having a JSON filter without a source fails, but that is expected.)
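
One thing I still want to try is parsing the whole line with the JSON filter into a dedicated target field, so the parsed Filebeat event (with its inner "message" string) does not collide with the fields Logstash sets itself. A rough, untested sketch (the "filebeat" target name and the path are arbitrary choices of mine):

```
input {
  file {
    path => "/path/to/synced/filebeat-output-*"   # placeholder path
    mode => "tail"
  }
}

filter {
  json {
    source => "message"    # the raw line, i.e. the full Filebeat JSON event
    target => "filebeat"   # parsed event lands under [filebeat] instead of the root
    # the original application log line then ends up in [filebeat][message]
  }
}
```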

I will keep looking into this, but thanks for the initial input.
