Grok with custom pattern works in debugger but not in pipeline

Hi,

I'm having a lot of problems getting a dataset into Elastic.

In the Grok Debugger it works.

Log

<30>2023:12:08-12:59:39 fw-swr-2 ulogd[32373]:

grok pattern

.*>%{SOPHOS_TIMESTAMP:_tmp.timestamp} %{TEST:firewall.name}

custom pattern

SOPHOS_TIMESTAMP (?:%{YEAR}:%{MONTHNUM}:%{MONTHDAY}-%{HOUR}:%{MINUTE}:%{SECOND})
TEST [a-zA-Z0-9._-]+

output

{
  "_tmp": {
    "timestamp": "2023:12:08-12:59:39"
  },
  "firewall": {
    "name": "fw-swr-2"
  }
}

In the ELK Stack GUI:

pattern

.*>%{SOPHOS_TIMESTAMP:_tmp.timestamp} %{TEST:firewall.name}

custom pattern:

{
  "SOPHOS_TIMESTAMP": "(?:%{YEAR}:%{MONTHNUM}:%{MONTHDAY}-%{HOUR}:%{MINUTE}:%{SECOND})",
  "TEST": "[a-zA-Z0-9._-]+"
}

No entry in the database.
There is no other processor in the pipeline.

Is there something wrong?
Is there a log for the pipeline where I can see what's going wrong?

Hi @helldunkel

Assuming you mean the Grok Debugger in Kibana Dev Tools.

It should just be this:

You will have to provide more details

I used this log file

<30>2023:12:08-12:59:39 fw-swr-2 ulogd[32373]:
<30>2023:12:08-12:59:40 fw-swr-3 ulogd[32380]:
<30>2023:12:08-12:59:42 fw-abc-2 ulogd[32390]:
<30>2023:12:08-12:59:45 fw-xyz-2 ulogd[32300]:
<30>2023:12:08-12:59:57 fw-nnn-2 ulogd[32388]:

This Logstash conf:

input {
	file {
		path => "/Users/sbrown/workspace/sample-data/discuss/discuss-sophos.log"
		start_position => "beginning"
		sincedb_path => "/dev/null"
	}
}

filter {
	grok {
		match => { "message" => ".*>%{SOPHOS_TIMESTAMP:_tmp.timestamp} %{TEST:firewall.name}"}
		pattern_definitions => {
				"SOPHOS_TIMESTAMP" => "(?:%{YEAR}:%{MONTHNUM}:%{MONTHDAY}-%{HOUR}:%{MINUTE}:%{SECOND})"
				"TEST" => "[a-zA-Z0-9._-]+"
		}
	}
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
  }
	stdout{}
}

And the data loaded fine...

Output from logstash

{
        "@timestamp" => 2023-12-09T16:09:24.078651Z,
             "event" => {
        "original" => "<30>2023:12:08-12:59:42 fw-abc-2 ulogd[32390]:"
    },
              "host" => {
        "name" => "hyperion"
    },
               "log" => {
        "file" => {
            "path" => "/Users/sbrown/workspace/sample-data/discuss/discuss-sophos.log"
        }
    },
    "_tmp.timestamp" => "2023:12:08-12:59:42",
          "@version" => "1",
           "message" => "<30>2023:12:08-12:59:42 fw-abc-2 ulogd[32390]:",
     "firewall.name" => "fw-abc-2"
}
....

In Elastic

GET logs-*/_search

{
  "took": 1,
  "timed_out": false,
  "_shards": {
    "total": 2,
    "successful": 2,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": {
      "value": 6,
      "relation": "eq"
    },
    "max_score": 1,
    "hits": [
      {
        "_index": ".ds-logs-generic-default-2023.12.09-000001",
        "_id": "8rRYT4wBpwIAo0SDQYd0",
        "_score": 1,
        "_ignored": [
          "_tmp.timestamp"
        ],
        "_source": {
          "@timestamp": "2023-12-09T16:10:33.056416Z",
          "log": {
            "file": {
              "path": "/Users/sbrown/workspace/sample-data/discuss/discuss-sophos.log"
            }
          },
          "firewall.name": "fw-swr-2",
          "data_stream": {
            "namespace": "default",
            "type": "logs",
            "dataset": "generic"
          },
          "host": {
            "name": "hyperion"
          },
          "@version": "1",
          "_tmp.timestamp": "2023:12:08-12:59:39",
          "event": {
            "original": "<30>2023:12:08-12:59:39 fw-swr-2 ulogd[32373]:"
          },
          "message": "<30>2023:12:08-12:59:39 fw-swr-2 ulogd[32373]:"
        }
      },
      {
        "_index": ".ds-logs-generic-default-2023.12.09-000001",
        "_id": "8bRYT4wBpwIAo0SDQYd0",
        "_score": 1,
        "_ignored": [
          "_tmp.timestamp"
        ],
        "_source": {
          "@timestamp": "2023-12-09T16:10:33.057941Z",
          "log": {
            "file": {
              "path": "/Users/sbrown/workspace/sample-data/discuss/discuss-sophos.log"
            }
          },
          "firewall.name": "fw-nnn-2",
          "data_stream": {
            "namespace": "default",
            "type": "logs",
            "dataset": "generic"
          },
          "host": {
            "name": "hyperion"
          },
          "@version": "1",
          "_tmp.timestamp": "2023:12:08-12:59:57",
          "event": {
            "original": "<30>2023:12:08-12:59:57 fw-nnn-2 ulogd[32388]:"
          },
          "message": "<30>2023:12:08-12:59:57 fw-nnn-2 ulogd[32388]:"
        }
      },
....

Hi,

OK.
The setup is a Security Onion distributed setup. The firewall sends syslog to a receiver node, which forwards it to the manager node.

All logs from the firewall are sent to a custom pipeline.

The pipeline config in Elastic Management > Ingest Pipelines:

[
  {
    "grok": {
      "field": "message",
      "patterns": [
        ".*>%{SOPHOS_TIMESTAMP:_tmp.timestamp} %{TEST:firewall.name}"
      ],
      "pattern_definitions": {
        "SOPHOS_TIMESTAMP": "(?:%{YEAR}:%{MONTHNUM}:%{MONTHDAY}-%{HOUR}:%{MINUTE}:%{SECOND})",
        "TEST": "[a-zA-Z0-9._-]+"
      }
    }
  }
]

All tests in the Grok Debugger worked.

My question is: is there a way to see what happens in the pipeline, and why it did not work?
I have no way to start debugging on the running system.

Hi,

OK. It isn't the grok or the custom pattern, it is the database entry:

%{TEST:firewall.name} -> no
%{TEST:test.test} -> no
%{TEST:test} -> no
%{TEST:host.name} -> yes
%{TEST:host.hostname} -> yes

But why? Why can I not set a new data field?

And found it.

A misconfig deep inside Index Management.
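For reference, a quick way to spot this kind of misconfig is to check how the field is actually mapped. The index pattern below is taken from the examples earlier in this thread, so adjust it to your own indices; the second request just lists the installed index templates:

GET .ds-logs-generic-default-*/_mapping/field/firewall.name
GET _index_template

The _ignored entries in the earlier search results are the same hint: they list fields that were dropped by the mapping rather than indexed.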

Glad you got it working...

Just for next time... you can use the verbose parameter of the Simulate Pipeline API to see more details:

Query parameters

verbose
(Optional, Boolean) If true, the response includes output data for each processor in the executed pipeline.
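For example, in Kibana Dev Tools you can run the pipeline against a sample document from this thread; the pipeline name below is a placeholder, substitute your custom pipeline's name:

POST _ingest/pipeline/my-sophos-pipeline/_simulate?verbose=true
{
  "docs": [
    {
      "_source": {
        "message": "<30>2023:12:08-12:59:39 fw-swr-2 ulogd[32373]:"
      }
    }
  ]
}

With verbose=true the response lists each processor in turn, its output document, and any error, so you can see exactly where the pipeline stops working before anything reaches the index.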
