I'm not seeing any GeoIP data from my Zeek logs in my SIEM map

I'm using Filebeat 7.3.0 to ingest Zeek network logs into Elasticsearch 7.3.0, but I'm not seeing any geoip fields created in my indices.
The Zeek ingest module automatically created pipelines for each of the different log file types: http, ssl, dns, etc.

Here is my filebeat.yml:
#-------------------------- Elasticsearch output ------------------------------
output.elasticsearch:
  hosts: ["http://[private natted ip]:9200"]
  pipeline: "filebeat-7.3.0-zeek-http-pipeline"

Here is my pipeline for filebeat-7.3.0-zeek-http-pipeline:
{
  "filebeat-7.3.0-zeek-http-pipeline" : {
    "description" : "Pipeline for normalizing Zeek http.log",
    "processors" : [
      {
        "script" : {
          "source" : "ctx.event.created = ctx['@timestamp']; ctx['@timestamp'] = (long)ctx['zeek']['http']['ts'] * 1000; ctx.zeek.http.remove('ts');",
          "lang" : "painless"
        }
      },
      {
        "set" : {
          "field" : "event.id",
          "value" : "{{zeek.session_id}}",
          "if" : "ctx.zeek.session_id != null"
        }
      },
      {
        "set" : {
          "field" : "source.ip",
          "value" : "{{source.address}}"
        }
      },
      {
        "set" : {
          "field" : "destination.ip",
          "value" : "{{destination.address}}"
        }
      },
      {
        "set" : {
          "field" : "url.port",
          "value" : "{{destination.port}}"
        }
      },
      {
        "geoip" : {
          "field" : "destination.ip",
          "target_field" : "destination.geo"
        }
      },
      {
        "geoip" : {
          "field" : "source.ip",
          "target_field" : "source.geo"
        }
      },
      {
        "user_agent" : {
          "field" : "user_agent.original",
          "ignore_missing" : true
        }
      }
    ],
    "on_failure" : [
      {
        "set" : {
          "field" : "error.message",
          "value" : "{{ _ingest.on_failure_message }}"
        }
      }
    ]
  }
}

Here is a sample document that is missing the geoip fields:
{
  "_index" : "filebeat-7.3.0-2019.08.09-000001",
  "_type" : "_doc",
  "_id" : "FTYveGwBUFrAUZ51xToN",
  "_version" : 1,
  "_seq_no" : 122365,
  "_primary_term" : 1,
  "found" : true,
  "_source" : {
    "agent" : {
      "hostname" : "zeek-server",
      "id" : "d9b26eb5-e10c-48c5-919a-254ffed87cb9",
      "type" : "filebeat",
      "ephemeral_id" : "67c89f50-8910-4366-997c-4387da92508a",
      "version" : "7.3.0"
    },
    "log" : {
      "file" : {
        "path" : "/usr/local/bro/spool/logger/http.log"
      },
      "offset" : 523009
    },
    "destination" : {
      "address" : "35.171.236.97",
      "port" : 80
    },
    "zeek" : {
      "session_id" : "C9V9Al4BAB3hboznb8",
      "http" : {
        "resp_mime_types" : [
          "text/json"
        ],
        "trans_depth" : 1,
        "status_msg" : "OK",
        "ts" : "2019-08-09T21:01:30.272945Z",
        "resp_fuids" : [
          "FUEzx03onGBuCa9xQk"
        ],
        "tags" : [ ]
      }
    },
    "source" : {
      "address" : "[internal natted ip redacted]",
      "port" : 63052
    },
    "fileset" : {
      "name" : "http"
    },
    "error" : {
      "message" : "cannot explicitly cast def [java.lang.String] to long"
    },
    "network" : {
      "community_id" : "1:h8xEg+e7aCg3NNU48d2V7rublBk=",
      "transport" : "tcp"
    },
    "tags" : [
      "zeek.http"
    ],
    "input" : {
      "type" : "log"
    },
    "@timestamp" : "2019-08-09T21:01:31.210Z",
    "ecs" : {
      "version" : "1.0.1"
    },
    "service" : {
      "type" : "zeek"
    },
    "host" : {
      "name" : "vhsbns"
    },
    "http" : {
      "request" : {
        "body" : {
          "bytes" : 0
        }
      },
      "response" : {
        "status_code" : 200,
        "body" : {
          "bytes" : 33
        }
      },
      "version" : "1.1"
    },
    "event" : {
      "created" : "2019-08-09T21:01:31.210Z",
      "module" : "zeek",
      "dataset" : "zeek.http"
    }
  }
}

What am I doing wrong?

I'm not terribly familiar with Zeek, but this parsing error happened because zeek.http.ts is the string "2019-08-09T21:01:30.272945Z" instead of the expected numeric epoch time, so the pipeline's Painless cast to long fails; that's what the error.message in your document is telling you: "cannot explicitly cast def [java.lang.String] to long". Because the script processor fails, the pipeline's on_failure handler runs and the later geoip processors never execute.

Have you changed anything in the Zeek configuration before getting these logs?
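To make the difference between the two formats concrete, here's a small Python sketch (the ISO timestamp is taken from the sample document above; the epoch value is its equivalent): Zeek's default ts is epoch seconds, which a numeric cast can handle, while an ISO 8601 string has to be parsed.

```python
from datetime import datetime, timezone

# Zeek's default output writes ts as epoch seconds (a number), which
# the pipeline's Painless `(long) ts * 1000` cast can work with:
epoch_ts = 1565384490.272945  # equivalent of 2019-08-09T21:01:30.272945Z

# With Zeek switched to ISO 8601 output, ts arrives as a string, so a
# numeric cast fails and the value must be parsed instead:
iso_ts = "2019-08-09T21:01:30.272945Z"
parsed = datetime.strptime(iso_ts, "%Y-%m-%dT%H:%M:%S.%fZ").replace(tzinfo=timezone.utc)

# Both forms describe the same instant:
assert int(parsed.timestamp()) == int(epoch_ts)
assert parsed.microsecond == 272945
```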

cwurm,
That was it!
Yes, I had changed Zeek's log output to use a universal (ISO 8601) time format. Once I removed that change from the Zeek configuration, all my geoip data started to process as expected.
Thank you very much for your help.
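For anyone who lands on this thread and wants to keep ISO 8601 timestamps in Zeek rather than reverting: an alternative should be to replace the Painless cast in the pipeline's script processor with a date processor, which understands ISO 8601. This is an untested sketch, not the module's stock pipeline; the field name comes from the thread above.

```json
{
  "date" : {
    "field" : "zeek.http.ts",
    "target_field" : "@timestamp",
    "formats" : [ "ISO8601" ]
  }
}
```

You would still want to remove zeek.http.ts afterwards (as the original script does) if you don't want the raw field indexed.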