Reading multiple files and writing into multiple files

I want to read multiple log files and write into multiple files (the intention is to test). My Filebeat configuration looks like this:

filebeat.prospectors:

- input_type: log
  enabled: true
  paths:
    - C:\data\Logs\eis.log
  fields: {log_type: eis}


- input_type: log
  enabled: true
  paths:
    - C:\data\Logs\sa.log
  fields: {log_type: sa}
  
  multiline.pattern: '^[[:space:]]+(at|\.{3})\b|^Caused by:'
  multiline.negate: false
  multiline.match: after

output.file:
  path: "c:/var/"
  filename: "filebeat.log"

Hi @Raghuveer_SJ,

Beats can only have one output, and the file output can only write to a single file.

But I see Filebeat has written both log files into a single file. The console only shows harvesting for one of the filenames, which confused me at first. However, both files are being read, so that's fine.

However, I don't see the stack trace being grouped by Filebeat; it is sent as individual lines. So I tried adding the multiline codec in Logstash:

input {
  beats {
    port => 5044
    codec => multiline {
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate => true
      what => previous
    }
  }
}

But Logstash reports that this configuration is not supported with the beats input. Please suggest.

You should add the multiline configuration in Filebeat itself. Check the Filebeat documentation; it is very similar to the one in Logstash: https://www.elastic.co/guide/en/beats/filebeat/6.3/multiline-examples.html
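For example, the Logstash codec above translates almost directly into Filebeat's multiline settings on the prospector. A sketch, assuming your log lines start with an ISO8601-style timestamp (Filebeat uses plain regular expressions, so there is no `%{TIMESTAMP_ISO8601}` grok pattern; spell the timestamp out):

- input_type: log
  enabled: true
  paths:
    - C:\data\Logs\sa.log
  fields: {log_type: sa}
  # Lines that do NOT start with a timestamp are appended to the
  # previous line, so a stack trace stays in one event.
  multiline.pattern: '^\d{4}-\d{2}-\d{2}'
  multiline.negate: true
  multiline.match: after

With this in place you can remove the multiline codec from the Logstash beats input entirely.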

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.