Filebeat - Dissect Message String

Hi,

I am looking for advice on how to use the dissect processor within Filebeat for a log file. Below is an example of the log file data:

[08/10/2020 09:31:57]	   	servername - Processor Queue	Ok	3	WMI (localhost:ProcessorQueueLength)	4890
[08/10/2020 09:32:25]	   	servername - HTTP Connections Spiking	Bad	5.00	Perf Counter test (Current Connections)	4828
[08/10/2020 09:32:30]	   	servername - HTTP Connections Spiking	Bad	8.00	Perf Counter test (Current Connections)	4871
[08/10/2020 09:32:38]	   	servername - HTTP Connections Spiking	Bad	13.00	Perf Counter test (Current Connections)	4926
[08/10/2020 09:32:43]	   	servername - Processor Queue	Bad	7	WMI (localhost:ProcessorQueueLength)	3473
[08/10/2020 09:32:57]	   	servername - Processor Queue	Bad	8	WMI (localhost:ProcessorQueueLength)	4893
[08/10/2020 09:32:57]	   	servername - Processor Queue	Bad	37	WMI (localhost:ProcessorQueueLength)	4902
[08/10/2020 09:33:00]	   	servername - HTTP Connections Spiking	Ok	7.00	Perf Counter test (Current Connections)	4828
[08/10/2020 09:33:02]	   	servername - HTTP Connections Spiking	Ok	9.00	Perf Counter test (Current Connections)	4871
[08/10/2020 09:33:11]	   	servername - HTTP Connections Spiking	Ok	18.00	Perf Counter test (Current Connections)	4926
[08/10/2020 09:33:53]	   	servername - Processor Percentage	Bad	100 %	CPU Usage	4881
[08/10/2020 09:33:59]	   	servername - Processor Queue	Ok	1	WMI (localhost:ProcessorQueueLength)	4902
[08/10/2020 09:33:59]	   	servername - Processor Queue	Ok	4	WMI (localhost:ProcessorQueueLength)	4893

I want to take the message and dissect it into the fields timestamp, hostname, test, status, reply, and testmethod. This is what I currently have in my filebeat.yml file:

- type: log
  enabled: true
  paths:
    - C:\ProgramData\Monitor\Logs\*.txt 
  processors:
    - dissect:
        tokenizer: '%{timestamp|integer} %{hostname} - %{test} %{status} %{reply} %{testmethod}'
        field: "message"

If possible I would also like it to drop the events which are 'OK', and only collect 'Bad' logs.

thanks in advance

Ian

Hi Ian,

First, check your dissect pattern using tools like this.
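From the sample you pasted, the columns look tab-separated and the test names themselves contain spaces, so a purely space-delimited tokenizer will match differently from line to line. As a rough sketch only (this assumes literal tab characters between the columns; blank and id are placeholder names I've used for the empty column after the timestamp and for the trailing number):

- type: log
  enabled: true
  paths:
    - C:\ProgramData\Monitor\Logs\*.txt
  processors:
    - dissect:
        # \t inside the double-quoted string stands for a literal tab.
        tokenizer: "[%{timestamp}]\t%{blank}\t%{hostname} - %{test}\t%{status}\t%{reply}\t%{testmethod}\t%{id}"
        field: "message"
        # An empty target_prefix writes the keys at the root of the event
        # instead of under the default "dissect." prefix.
        target_prefix: ""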

If possible I would also like it to drop the events which are 'OK', and only collect 'Bad' logs.

You can use the drop_event processor for that.
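For example, something like this in the processors list, after the dissect processor (a sketch; the status field name assumes the dissected keys are written to the root of the event as in the snippet above):

processors:
  - drop_event:
      # Drops every event whose dissected status is "Ok", so only the
      # "Bad" results are shipped.
      when:
        equals:
          status: "Ok"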

Thanks for pointing me in the direction of the debugger, @borna_talebi. I have tried splitting my logs down; however, when I put the Processor Queue data into it, it comes back with "error: empty string provided".

I'm really struggling to dissect it properly. I can get the timestamp and servername dissected, but after that I get varying results depending on which line of data it is reading.

With regard to dropping the Ok data, I did manage to do that, but my manager wants to keep it all to get better statistics.

I am now seeing if I can make any headway using Logstash/Grok as an option, but again I am new to that, so progress with that debugger is also very slow. In an ideal world I would prefer to just use Filebeat, but I am open to any suggestion. Any other help or suggestions would be really appreciated.

thanks

Ian

You could use an ingest pipeline and define several dissect processors in it. Using an ingest pipeline moves the dissect processing to Elasticsearch rather than Filebeat, and it makes problems easier to debug with the simulate API.
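Once the pipeline exists in Elasticsearch, you can tell Filebeat to send events through it from the output section of filebeat.yml, roughly like this (the host and pipeline name here are just placeholders):

output.elasticsearch:
  hosts: ["localhost:9200"]
  # Name of the ingest pipeline you created in Elasticsearch.
  pipeline: "monitor-logs"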

Take a look at this; it might help.

If you could send me your dissect pattern, I might be able to find the problem.

thanks

Borna
