Aggregate Filter not keeping fields

I'm having a lot of trouble with the aggregate filter in Logstash. I can create an aggregated event (although there are some errors there), but none of the fields are populated on it except for the two I set myself.

I'm trying to aggregate Windows logon events. I don't want to forward every individual event; instead I'd like a count per host or per user every X minutes. I'm using the latest Winlogbeat on a Windows Server 2012 R2 machine, and I have no problem getting individual Windows events through with all of their fields parsed properly.
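
For illustration, this is roughly the summary event I'm hoping to see every couple of minutes: the count, plus the usual fields from the logon events carried over. This is just a mock-up; the user values are made up and the exact layout isn't important.

    {
         "@timestamp" => 2020-06-05T12:06:32.116Z,
           "hostname" => "DC1.zorg.com",
              "count" => 2,
             "client" => {
            "user" => {
                  "name" => "jdoe",
                "domain" => "ZORG"
            }
        }
    }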

Here is my Logstash pipeline:

    input {
        beats {
            port => "5044"
        }
    }
    filter {
        mutate {
            # change fields to conform with ECS
            rename => { "[winlog][event_data][TargetUserName]" => "[client][user][name]" }
            lowercase => [ "[client][user][name]" ]
            rename => { "[winlog][event_data][TargetDomainName]" => "[client][user][domain]" }

            # fields for aggregation count and flag
            add_field => { 'count' => 0 }
            add_field => { 'Aggregation' => false }

            # copying this to an un-nested field for ease later
            add_field => { "hostname" => "%{[host][name]}" }
        }
        if [event][code] == 4624 {
            aggregate {
                task_id => "%{hostname}"
                code => "
                    map['count'] ||= 1
                    map['count'] += 1
                    event.cancel()
                "
                push_map_as_event_on_timeout => true
                timeout_task_id_field => "%{hostname}"
                timeout => 120
                timeout_code => "map['count'] = 0"
            }
        }
        else {
            drop {}
        }
    }
    output {
        stdout { codec => rubydebug }
    }

This is the event that gets pushed when the aggregation times out:

    {
         "@timestamp" => 2020-06-05T12:06:32.116Z,
               "tags" => [
            [0] "_aggregateexception"
        ],
           "@version" => "1",
        "%{hostname}" => "DC1.zorg.com",
              "count" => 1998
    }

You'll also notice that the count says 1998, and that was after only 2 logins. I also get the following error message, along with a deprecation warning:

    [ERROR] 2020-06-05 08:06:32.120 [[main]>worker1] aggregate - Aggregate exception occurred {:error=>#<NameError: undefined local variable or method `map' for #<LogStash::Filters::Aggregate:0x6edeeb>>, :timeout_code=>"map['count'] = 0", :timeout_event_data=>{"@timestamp"=>2020-06-05T12:06:32.116Z, "%{hostname}"=>"DC1.zorg.com", "@version"=>"1", "count"=>1998}}

    /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/awesome_print-1.7.0/lib/awesome_print/formatters/base_formatter.rb:31: warning: constant ::Fixnum is deprecated

I have a feeling the answer is going to be that I have to set each field manually, but there are probably 50 fields on that single event type (logon) alone, and there are many more event types I want to aggregate, so manually setting 1,000+ fields to cover every event type doesn't seem feasible.
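
To be clear about what "manually" would mean here: as far as I can tell, the only way to keep those fields would be to copy each one into the map by hand inside the aggregate filter's code block, something like the sketch below. The flattened target names (client_user_name and so on) are just placeholders I made up.

    code => "
        map['count'] ||= 0
        map['count'] += 1

        # one line like this for every field I want kept on the summary event
        map['client_user_name']   ||= event.get('[client][user][name]')
        map['client_user_domain'] ||= event.get('[client][user][domain]')
        map['host_name']          ||= event.get('[host][name]')
        # ...and so on, for roughly 50 fields per event type

        event.cancel()
    "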