Logstash geoIP and dissect issues

I'm having a couple of issues, and I'm not sure if it's syntax or something else. The end result I want is to read in a 24-hour-old .txt file, mimic it being read in as live data, extract an outside IP address from the message field (the GREEDYDATA portion), and use that with geoip to put the IP address on a map in Kibana. The rest of the data in each entry is irrelevant. Here is my config file:
input {
  file {
    path => "//ntsvc/Logs/SyslogCatchAll-2017-10-22.txt"
    start_position => "beginning"
    sincedb_path => "NUL"
    ignore_older => 0
  }
}

filter {
  throttle {
    period => 1
    max_age => 3
    before_count => 3
    after_count => 20
    key => "%{message}"
    # limiting entries per second is the only way I have found to somewhat mimic live input
  }
  grok {
    named_captures_only => false
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}%{SPACE}Local4.%{LOGLEVEL:level}%{SPACE}%{IP:ourIP}%{SPACE}%{CISCOTIMESTAMP:time}: %%{CISCOTAG:asa}%{GREEDYDATA:messageEnd}" }
  }
  if "outside" in [messageEnd] {
    dissect {
      mapping => {
        "message" => "%{?outside:}%[IP:ip]: %{msg}"
      }
      # attempting to use "outside:" as a delimiter, store the next portion as the IP address, and store the rest as msg
    }
  } else {
    # drop all entries without an outside IP address
    drop { }
  }
  geoip {
    source => "ip"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

GeoIP is not working. I could not quite understand how to use the dissect filter after reading the documentation, so I have a very strong feeling that portion is incorrect.
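For comparison, here is a minimal sketch of how that filter section might look. The sample layout it assumes (`... outside:203.0.113.5/54321 ...`) and the field names are guesses, not taken from your logs; adjust the delimiters to whatever the ASA messages actually contain:

```
filter {
  # dissect splits on literal delimiters, so every piece of text written
  # between %{field} references must appear verbatim in the event
  dissect {
    mapping => {
      # %{} (an empty field name) skips everything up to "outside:",
      # %{ip} captures up to the next "/", and %{msg} keeps the rest
      "messageEnd" => "%{}outside:%{ip}/%{msg}"
    }
  }

  # alternatively, grok's %{IP} pattern matches the address regardless of
  # what follows it, which is more forgiving if the layout varies:
  # grok { match => { "messageEnd" => "outside:%{IP:ip}" } }

  geoip {
    source => "ip"
  }
}
```

Note that plugin names are lowercase, so it must be `geoip`; `geoIP` will fail to load.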

Thanks for any help!

This is basically the same question as Logstash Dissect Filter.

As Magnus mentions, it would be better if you updated the original thread with this extra info, as it helps others understand the problem and what you've already worked through 🙂