Parsing with Logstash looks correct in debug but messages are not ingested

Hi Everyone,

I am struggling to ingest messages using Logstash. I have been working on the parser for the last 2 weeks but have been unable to succeed. The Grok debugger and rubydebug both show that the messages are parsed correctly, however they never end up getting ingested. Can someone please help?

Here are the messages.

2024-12-25 05:38:08,566 - INFO - Login attempt: Username=monitor, IP=192.168.1.3
2024-12-25 05:38:08,566 - IP: 192.168.1.3 - Username: monitor, Password: monitor
2024-12-25 05:38:08,567 - INFO - 192.168.1.3 - - [25/Dec/2024 05:38:08] "POST /login HTTP/1.1" 200 -
2024-12-25 05:38:50,121 - INFO - 192.168.1.3 - - [25/Dec/2024 05:38:50] "GET / HTTP/1.1" 200 -
2024-12-25 05:38:54,117 - INFO - Login attempt: Username=testing, IP=192.168.1.3
2024-12-25 05:38:54,117 - IP: 192.168.1.3 - Username: testing, Password: testing
2024-12-25 05:38:54,118 - INFO - 192.168.1.3 - - [25/Dec/2024 05:38:54] "POST /login HTTP/1.1" 200 -

And here are my parsers:

if [type] == "FGT" {
    # For Login Attempt
    if [message] =~ "Login attempt" {
        grok {
            match => {
                "message" => "%{YEAR:year}-%{MONTHNUM:month}-%{MONTHDAY:day} %{HOUR:hour}:%{MINUTE:minute}:%{SECOND:second},%{INT:millisecond} - %{LOGLEVEL:log_level} - Login attempt: Username=%{DATA:username}, IP=%{IP:src_ip}"
            }
        }
        mutate {
            add_field => { "timestamp" => "%{year}-%{month}-%{day} %{hour}:%{minute}:%{second},%{millisecond}" }
            remove_field => ["year", "month", "day", "hour", "minute", "second", "millisecond"]
        }
        date {
            match => ["timestamp", "yyyy-MM-dd HH:mm:ss,SSS"]
            target => "@timestamp"
        }
        mutate {
            add_field => { "event_type" => "login_attempt" }
            remove_field => ["message", "timestamp"]
        }
    }

    # For Credentials Provided
    else if [message] =~ "Password:" {
        grok {
            match => {
                "message" => "%{YEAR:year}-%{MONTHNUM:month}-%{MONTHDAY:day} %{HOUR:hour}:%{MINUTE:minute}:%{SECOND:second},%{INT:millisecond} - IP: %{IP:src_ip} - Username: %{DATA:username}, Password: %{DATA:password}"
            }
        }
        mutate {
            add_field => { "timestamp" => "%{year}-%{month}-%{day} %{hour}:%{minute}:%{second},%{millisecond}" }
            remove_field => ["year", "month", "day", "hour", "minute", "second", "millisecond"]
        }
        date {
            match => ["timestamp", "yyyy-MM-dd HH:mm:ss,SSS"]
            target => "@timestamp"
        }
        mutate {
            add_field => { "event_type" => "credentials_provided" }
            remove_field => ["message", "timestamp"]
        }
    }

    # Drop all other log lines
    else {
        drop { }
    }
}

Are these real lines from the log? I mean raw lines, not transformed ones.
It may be better to split the message into the leading date part and the rest of the line with the dissect filter. Then compare: if the remainder starts with "IP:" it's a username/password line (`if [messagepart2] =~ /^IP/`); the other lines look like web access records.
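A minimal sketch of that approach (field names like `messagepart2` follow the suggestion above; the grok patterns are only illustrative and would need to be checked against the real raw lines):

```
filter {
  # Split off the leading date/time; everything after the first " - "
  # lands in one field for further matching.
  dissect {
    mapping => { "message" => "%{log_date} %{log_time} - %{messagepart2}" }
  }

  if [messagepart2] =~ /^IP:/ {
    # Credentials line: "IP: 192.168.1.3 - Username: monitor, Password: monitor"
    grok {
      match => { "messagepart2" => "IP: %{IP:src_ip} - Username: %{DATA:username}, Password: %{GREEDYDATA:password}" }
    }
  } else if [messagepart2] =~ /Login attempt/ {
    # Login attempt line: "INFO - Login attempt: Username=monitor, IP=192.168.1.3"
    grok {
      match => { "messagepart2" => "%{LOGLEVEL:log_level} - Login attempt: Username=%{DATA:username}, IP=%{IP:src_ip}" }
    }
  }

  # Rebuild the timestamp from the two dissected parts and parse it.
  mutate { add_field => { "timestamp" => "%{log_date} %{log_time}" } }
  date {
    match => ["timestamp", "yyyy-MM-dd HH:mm:ss,SSS"]
    target => "@timestamp"
  }
}
```

This avoids repeating the eight date/time grok captures and the rebuild/remove mutate in every branch, since the timestamp handling is shared.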

You can additionally compare the remainder against /^(INFO|DEBUG|ERROR)/ (or whatever the possible level values are) to get a stronger condition.
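Continuing that idea, assuming the part of the line after the date has been dissected into `messagepart2` as suggested above (the `event_type` value here is just illustrative):

```
# Stronger condition: a non-credential line starts with a known log level.
# Adjust the alternation to the levels your application actually emits.
if [messagepart2] =~ /^(INFO|DEBUG|ERROR)/ {
  mutate { add_field => { "event_type" => "app_log" } }
}
```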

Resolved by modifying the actual code and removing the non-parsing entries.