Unexpected _dateparsefailure with ISO8601 timestamp from winlogs

Hi, I'm using Winlogbeat and the events arrive with a clean ISO8601 @timestamp (I used a stdout output in my Logstash conf to check them):

"@timestamp" => 2023-06-15T14:44:20.273Z,
which I think Logstash automatically uses as the event timestamp?

There's also a created field in the event:

"created" => "2023-06-15T14:44:21.303Z",

Here's what arrives and I see in Kibana:

"tags": [
    "winlogbeat",
    "beats_input_codec_plain_applied",
    "_grokparsefailure",
    "_dateparsefailure"
  ],

Here's my logstash conf for this:

input {
  beats {
    host => "0.0.0.0"
    port => 5046
    tags => "winlogbeat"
  }
}


filter {

  if [winlog][event_id] == "4625" or [winlog][event_id] == "4723" or [winlog][event_id] == "4724" or [winlog][channel] == "Microsoft-Windows-WMI-Activity/Operational" or [winlog][channel] == "Microsoft-Windows-PowerShell/Operational" {
    # Add any transformations here
  } else {
    drop { }
  }

}


output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "winlog-%{+YYYY.MM.dd}"
  }
}

Here's what the event logs look like with the original dates:

{
      "@version" => "1",
        "winlog" => {
                  "api" => "wineventlog",
             "keywords" => [
            [0] "Audit Failure"
        ],
              "channel" => "Security",
               "opcode" => "Info",
            "record_id" => 437346,
              "process" => {
            "thread" => {
                "id" => 6353
            },
               "pid" => 4
        },
        "provider_guid" => "{4646464-5478-4994-a5ba-6546465465446ae}",
           "event_data" => {
                "PrivilegeList" => "SeProfileSingleProcessPrivilege",
                 "ObjectServer" => "Security",
                      "Service" => "-",
               "SubjectUserSid" => "S-1-5-21-2362662624-123135234-2456245626-2310",
            "SubjectDomainName" => "SUPERDOO",
                    "ProcessId" => "0x15d0",
                  "ProcessName" => "C:\\Program Files\\Adobe\\Adobe Creative Cloud\\ACC\\Creative Cloud.exe",
               "SubjectLogonId" => "0x10479bd9",
              "SubjectUserName" => "byute"
        },
        "computer_name" => "box.boxesofboxes.icu",
        "provider_name" => "Microsoft-Windows-Security-Auditing",
                 "task" => "Sensitive Privilege Use",
             "event_id" => "4673"
    },
           "log" => {
        "level" => "information"
    },
         "agent" => {
                  "id" => "42e866f3-77cc-4322-55aa-26235236",
                "name" => "box",
                "type" => "winlogbeat",
        "ephemeral_id" => "bfaab648-0f87-3244-abcd-2346326236",
             "version" => "8.8.1"
    },
           "ecs" => {
        "version" => "8.0.0"
    },
    "@timestamp" => 2023-06-15T14:44:20.273Z,
         "event" => {
        "provider" => "Microsoft-Windows-Security-Auditing",
         "outcome" => "failure",
         "created" => "2023-06-15T14:44:21.303Z",
          "action" => "Sensitive Privilege Use",
            "kind" => "event",

Can you share your full logstash configuration? The one you shared is not complete.

You shared that you have the tags _grokparsefailure and _dateparsefailure in one of your documents, but the logstash configuration does not show any grok or date filter.

Yes! Isn't that strange? There are no grok or date filters in my config at all!

Check it out:

$ ll
total 44
drwxr-xr-x. 2 root root    84 Jun 15 14:46 conf.d
-rw-r--r--. 1 root root  1833 Apr 20 06:47 jvm.options
-rw-r--r--. 1 root root  7437 Apr 20 06:47 log4j2.properties
-rw-r--r--. 1 root root   352 May  4 16:42 logstash-sample.conf
-rw-r--r--. 1 root root   342 Apr 20 06:47 logstash-sample.conf~
-rw-r--r--. 1 root root 15049 May  4 16:40 logstash.yml
-rw-r--r--. 1 root root   285 Apr 20 06:47 pipelines.yml
-rw-------. 1 root root  1696 Apr 20 06:47 startup.options
$ cat logstash.yml | grep -v '^#'
path.data: /var/lib/logstash






path.logs: /var/log/logstash
$ cat pipelines.yml
# This file is where you define your pipelines. You can define multiple.
# For more information on multiple pipelines, see the documentation:
#   https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html

- pipeline.id: main
  path.config: "/etc/logstash/conf.d/*.conf"
$ ll conf.d/
total 12
-rw-r--r--. 1 root root 506 Jun 15 14:46 winlogbeat-xxxxxxx.conf

Do you have other files in this path?

If there are other files in this path, then the way you are running Logstash will merge them all into one big pipeline, and every event will pass through every filter that is not guarded by a conditional.
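For example, a hypothetical syslog conf like the one below, sitting in the same conf.d folder, would be merged into the same pipeline; its grok and date filters would then run against your winlogbeat events too and tag them with _grokparsefailure and _dateparsefailure. (This file, its port, and its patterns are only an illustration, not your actual config.)

```text
# hypothetical /etc/logstash/conf.d/syslog.conf -- illustration only
input {
  udp {
    port => 5044
    tags => ["syslog"]
  }
}

filter {
  # Without a conditional guard such as `if "syslog" in [tags] { ... }`,
  # these filters run on EVERY event in the merged pipeline,
  # including the winlogbeat events from the other conf file.
  grok {
    match => { "message" => "%{SYSLOGLINE}" }
  }
  date {
    match => [ "timestamp", "MMM dd HH:mm:ss" ]
  }
}
```

When a winlogbeat event hits that grok filter, the pattern does not match, so the event is tagged _grokparsefailure; the date filter then finds no parseable timestamp field and adds _dateparsefailure.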

Update: I think I'm learning that if you have two Logstash confs for two different sources (one handling syslog over UDP, the other handling winlogbeat), Logstash isn't meant to run them that way from one folder. Is that true?

Is it taking both Logstash confs and smashing them together?

Original reply below.

Not on this Logstash node. It's just receiving the winlogbeat events.
That output goes into the Elasticsearch node on the same machine (part of the Elasticsearch cluster we have).

We do have another Logstash node which receives network and storage logs on 5044 and 5045,
and its output directive is the same:

elastic:9200

except that it points at the node on the same system, which is another Elasticsearch node in our cluster.

If your pipelines.yml has a pipeline pointing to a path with multiple conf files, as in what you shared, then Logstash will merge all the configuration files in that folder into a single pipeline.

If you want to run multiple pipelines independent of each other, you need to configure pipelines.yml to point each pipeline at its own configuration file, as explained in this part of the documentation.
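As a sketch (the pipeline ids and file names here are assumptions, adjust them to your own files), pipelines.yml would then look like:

```text
# /etc/logstash/pipelines.yml -- sketch with hypothetical ids and filenames
- pipeline.id: winlogbeat
  path.config: "/etc/logstash/conf.d/winlogbeat.conf"
- pipeline.id: syslog
  path.config: "/etc/logstash/conf.d/syslog.conf"
```

With this layout each conf file runs as an isolated pipeline with its own inputs, filters, and outputs, so filters from one source can never touch events from the other.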