How to use the prospector fields collection in a Logstash filter conditional

Hello,

I'm running 6.1.2.

Given this prospector:

- type: log
  enabled: true
  multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'
  multiline.negate: true
  multiline.match: after
  ignore_older: 24h
  fields:
    environment: iTest
    log_type: serilog-pipe-v1
  paths:
    - \\itstintfil01\IWMLogs\TxSender\machine01_*.txt
    - \\itstintfil01\IWMLogs\TxSender\machine02_*.txt

and this conditional block in the filter section of logstash.conf:

if [fields][log_type] in ["serilog-v2", "serilog-v3", "serilog-www", "serilog-v5", "serilog-http", "serilog-pipe-v1"] {
  # run some code here
}

Execution never drops into the if block, and I'm wondering if I'm using the in clause wrong.

Thank you,
Stephen

This looks correct. What does an actual message produced by Logstash look like? Use a stdout { codec => rubydebug } output to dump the raw event.
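
For reference, a minimal output section for this might look like the following (a sketch; add it alongside any outputs you already have):

output {
  stdout { codec => rubydebug }
}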

Here are a few samples that I pasted in. Is there a better way of inserting text like this so it is formatted?

{
    "CorrelationId" => "0126ecafab344aff92199e4b20e05f23",
    "TimeZoneOffset" => "-08:00",
    "Server" => "LIDT20EXTFLD01L",
    "Application" => "ExternalFields",
    "Severity" => "Warning",
    "offset" => 119382880,
    "fields" => {
        "log_type" => "serilog-pipe-v1",
        "environment" => "iTest"
    },
    "ThreadId" => "36",
    "year" => "2018",
    "host" => "LIDT20LOGST01",
    "month" => "02",
    "message" => "RecordToAWDFacadeRequest: Record without BusinessArea/WorkType: "<record objectid=\"2018-02-08-14.28.47.964420O01\" ba=\"NBC\">PLISTT2_New_Business3814518PLRouterINS-0497, SIDNEYVF550289901<Insured ID=\"63bf15b7-dfd3-4a8d-b82a-1e24804390e2\">SIDNEYINS-0497ELECTRONIC"",
    "day" => "10",
    "EventDate" => 2018-02-10T15:51:21.963Z,
    "source" => "\\itstintfil01\IWMLogs\ExternalFields\ExternalFields_LIDT20EXTFLD01L_20180210_001.txt",
    "@version" => "1",
    "prospector" => {
        "type" => "log"
    },
    "ProcessId" => "5352",
    "time" => "07:51:21.963",
    "@timestamp" => 2018-02-11T01:12:59.893Z,
    "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
    "beat" => {
        "hostname" => "LIDT20LOGST01",
        "version" => "6.1.2",
        "name" => "LIDT20LOGST01"
    }
}
{
    "CorrelationId" => "0126ecafab344aff92199e4b20e05f23",
    "TimeZoneOffset" => "-08:00",
    "Server" => "LIDT20EXTFLD01L",
    "Application" => "ExternalFields",
    "Severity" => "Warning",
    "offset" => 119383533,
    "fields" => {
        "log_type" => "serilog-pipe-v1",
        "environment" => "iTest"
    },
    "ThreadId" => "36",
    "year" => "2018",
    "host" => "LIDT20LOGST01",
    "month" => "02",
    "message" => "RecordToAWDFacadeRequest: Record without BusinessArea/WorkType: "<record objectid=\"2018-02-08-14.28.48.549440O01\" ba=\"NBC\">PLOOST2_New_Business3814519PLRouterINS-0497, SIDNEYVF550289902<Insured ID=\"63bf15b7-dfd3-4a8d-b82a-1e24804390e2\">SIDNEYINS-0497ELECTRONIC"",
    "day" => "10",
    "EventDate" => 2018-02-10T15:51:21.963Z,
    "prospector" => {
        "type" => "log"
    },
    "@version" => "1",
    "source" => "\\itstintfil01\IWMLogs\ExternalFields\ExternalFields_LIDT20EXTFLD01L_20180210_001.txt",
    "ProcessId" => "5352",
    "time" => "07:51:21.963",
    "@timestamp" => 2018-02-11T01:12:59.893Z,
    "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
    "beat" => {
        "hostname" => "LIDT20LOGST01",
        "version" => "6.1.2",
        "name" => "LIDT20LOGST01"
    }
}
{
    "CorrelationId" => "0126ecafab344aff92199e4b20e05f23",
    "TimeZoneOffset" => "-08:00",
    "Category" => "PacificLife.Services.ExternalFields.Service.ExternalFieldService",
    "Server" => "LIDT20EXTFLD01L",
    "Application" => "ExternalFields",
    "Severity" => "Info",
    "offset" => 119383911,
    "fields" => {
        "log_type" => "serilog-pipe-v1",
        "environment" => "iTest"
    },
    "ThreadId" => "36",
    "year" => "2018",
    "host" => "LIDT20LOGST01",
    "month" => "02",
    "message" => "EF Response "7f57342a93fa45bb8b5b715e553523b2" "\\itstintfil01\IWMLogs\ExternalFields\Payloads\20180210\7\7f57342a93fa45bb8b5b715e553523b2-ef-2018-02-10-07-51-21-963-1318b9ff2fc044e5911289b68f8596fa.xml"",
    "day" => "10",
    "EventDate" => 2018-02-10T15:51:21.963Z,
    "prospector" => {
        "type" => "log"
    },
    "@version" => "1",
    "source" => "\\itstintfil01\IWMLogs\ExternalFields\ExternalFields_LIDT20EXTFLD01L_20180210_001.txt",
    "ProcessId" => "5352",
    "time" => "07:51:21.963",
    "@timestamp" => 2018-02-11T01:12:59.893Z,
    "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
    "beat" => {
        "hostname" => "LIDT20LOGST01",
        "version" => "6.1.2",
        "name" => "LIDT20LOGST01"
    }
}
{
    "CorrelationId" => "0126ecafab344aff92199e4b20e05f23",
    "TimeZoneOffset" => "-08:00",
    "Category" => "PacificLife.Services.ExternalFields.Service.ExternalFieldService",
    "Server" => "LIDT20EXTFLD01L",
    "Application" => "ExternalFields",
    "Severity" => "Info",
    "offset" => 119384190,
    "fields" => {
        "log_type" => "serilog-pipe-v1",
        "environment" => "iTest"
    },
    "ThreadId" => "36",
    "year" => "2018",
    "host" => "LIDT20LOGST01",
    "month" => "02",
    "message" => "Completed operation bb130606-8212-4f8d-b9e9-6599a71dd2f3: "Query operation" in 00:00:00.1741198 (174 ms)",
    "day" => "10",
"EventDate" => 2018-02-10T15:51:21.963Z,
"prospector" => {
"type" => "log"
},
"@version" => "1",
"source" => "\\itstintfil01\IWMLogs\ExternalFields\ExternalFields_LIDT20EXTFLD01L_20180210_001.txt",
"ProcessId" => "5352",
"time" => "07:51:21.963",
"@timestamp" => 2018-02-11T01:12:59.893Z,
"tags" => [
[0] "beats_input_codec_plain_applied"
],
"beat" => {
"hostname" => "LIDT20LOGST01",
"version" => "6.1.2",
"name" => "LIDT20LOGST01"
}
}

Is there a way to export/report the LS pipeline with metrics from Kibana?

Is there a way to export/report the LS pipeline with metrics from Kibana?

Not sure exactly what you mean, but recent LS has a monitoring API and can self-report metrics to ES.
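
For example, the node stats API (served on port 9600 by default, assuming you haven't changed http.port) reports per-pipeline event and plugin metrics:

curl -XGET 'http://localhost:9600/_node/stats/pipelines?pretty'

With X-Pack monitoring enabled, the same metrics are shipped to Elasticsearch and show up in Kibana's Monitoring UI.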

Given what your events look like, the configuration you posted should be okay. Please show your full configuration.

@magnusbaeck Sorry for not getting back to you sooner, and thank you for your help on this matter. Before I log back into the boxes to grab the files, I was wondering if there is a way to debug this problem outside the environment, or to put debugging statements in the conditionals and test it that way. Everything I read online suggests that what I am trying to do should work, as you mentioned.

Thank you,
Stephen

Before I log back into the boxes to grab the files, I was wondering if there is a way to debug this problem outside the environment, or to put debugging statements in the conditionals and test it that way.

Not really. I'd simplify the configuration until I either get to a simple reproducible example or until it starts working, and in the latter case the problem should be narrowed down.
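
For example, here is a minimal sketch of that approach: fake the nested field that Filebeat would add, run the conditional under test, and tag the event so a match is visible in the output (the log_type value below is an assumption taken from your prospector config):

input {
  generator {
    count => 1
    # simulate the fields block that the Filebeat prospector adds
    add_field => { "[fields][log_type]" => "serilog-pipe-v1" }
  }
}
filter {
  if [fields][log_type] in ["serilog-pipe-v1"] {
    # if the conditional matches, this tag will show up in the event
    mutate { add_tag => ["matched_log_type"] }
  }
}
output {
  stdout { codec => rubydebug }
}

Run it with bin/logstash -f test.conf and check whether matched_log_type appears in the tags array.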

@magnusbaeck Magnus, again thank you for your assistance, and please accept my apology for not following up sooner, as I was building out our production ES environment.

I have Pipeline Management running on my 6.2.2 cluster, and that makes changing the pipeline such a breeze. :slight_smile:

Anyway, what ended up working was having the Filebeat prospector fields property I was looking for, e.g.

fields:
  log_type: 'serilog'

explicitly set as a string, then in the pipeline filter I looked for the same value and type (string), e.g.

if [fields][log_type] == "serilog" {
}

I also believe that having the YAML file formatted and validated (spaces matter) might have contributed to fixing the problem.
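
In case it helps anyone else, Filebeat can also check its own configuration, which catches YAML indentation mistakes early (a sketch assuming Filebeat 6.x and a config file named filebeat.yml):

filebeat test config -c filebeat.yml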

Regards from the States,
Stephen

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.