Two GeoIP sources and target locations

Hello, let me first say that I appreciate any help with this issue I have been fighting with.

Background:
I am loading firewall logs into a 5.6 ELK stack running on Windows. The data has mainly been filtered into .csv format, and everything seems to be searchable and working as I would hope. The last thing I would like to do is have the source-address and destination-address geo-locations displayed in the coordinate map visualization.

I've tried quite a bit to get the geoip filter to work as needed. After reading the Logstash geoip filter plugin documentation more thoroughly, I discovered that by not supplying a target for the filter I could get the default geoip target to work with the Kibana visualization, since the default target is mapped as type geo_point.
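For context, this is roughly the filter configuration I am working toward (the field names src_ip and dst_ip are placeholders for my actual CSV columns, and dst_geoip is just the target name I picked):

```conf
filter {
  # First geoip: no target, so the result lands in the default "geoip"
  # field, which the stock Logstash template maps with a geo_point "location".
  geoip {
    source => "src_ip"
  }
  # Second geoip: needs an explicit target so it doesn't collide with the
  # first, but nothing maps "dst_geoip.location" as geo_point - which is
  # exactly the problem I am trying to solve.
  geoip {
    source => "dst_ip"
    target => "dst_geoip"
  }
}
```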

The problem I am still experiencing is that I am unable to map the second set of geo information. From previous questions posted on this forum, I found that I can supposedly fix this by adding another index template.

I should say that I do not have much knowledge of or experience with Elasticsearch, so it would be very helpful if someone could give a few step-by-step instructions on how to do this. I have visited Kibana's Dev Tools page and tried running the following, but I continue to get the same error.

Query:

GET _template/*

Response:
{
  "logstash": {
    "order": 0,
    "version": 50001,
    "template": "logstash-*",
    "settings": {
      "index": {
        "refresh_interval": "5s"
      }
    },
    "mappings": {
      "_default_": {
        "dynamic_templates": [
          {
            "message_field": {
              "path_match": "message",
              "mapping": {
                "norms": false,
                "type": "text"
              },
              "match_mapping_type": "string"
            }
          },
          {
            "string_fields": {
              "mapping": {
                "norms": false,
                "type": "text",
                "fields": {
                  "keyword": {
                    "ignore_above": 256,
                    "type": "keyword"
                  }
                }
              },
              "match_mapping_type": "string",
              "match": "*"
            }
          }
        ],
        "_all": {
          "norms": false,
          "enabled": true
        },
        "properties": {
          "@timestamp": {
            "include_in_all": false,
            "type": "date"
          },
          "geoip": {
            "dynamic": true,
            "properties": {
              "ip": {
                "type": "ip"
              },
              "latitude": {
                "type": "half_float"
              },
              "location": {
                "type": "geo_point"
              },
              "longitude": {
                "type": "half_float"
              }
            }
          },
          "@version": {
            "include_in_all": false,
            "type": "keyword"
          }
        }
      }
    },
    "aliases": {}
  }
}

From what I've read, I believe I need to mimic this section and add a copy of it to the template:
"geoip": {
"dynamic": true,
"properties": {
"ip": {
"type": "ip"
},
"latitude": {
"type": "half_float"
},
"location": {
"type": "geo_point"
},
"longitude": {
"type": "half_float"
}
}
},

Now I think I simply need to run this:

PUT _template/geo_src
{
  "geo_src": {
    "dynamic": true,
    "properties": {
      "ip": {
        "type": "ip"
      },
      "latitude": {
        "type": "half_float"
      },
      "location": {
        "type": "geo_point"
      },
      "longitude": {
        "type": "half_float"
      }
    }
  },
}

But I am receiving this error:

{
  "error": {
    "root_cause": [
      {
        "type": "parse_exception",
        "reason": "Failed to parse content to map"
      }
    ],
    "type": "parse_exception",
    "reason": "Failed to parse content to map",
    "caused_by": {
      "type": "json_parse_exception",
      "reason": "Unexpected character ('}' (code 125)): was expecting double-quote to start field name\n at [Source: org.elasticsearch.transport.netty4.ByteBufStreamInput@7e8c4b04; line: 19, column: 2]"
    }
  },
  "status": 400
}
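Re-reading the index template docs, I suspect a valid body needs top-level "template" and "mappings" keys rather than starting with the field name. Here is a sketch of what I am thinking of trying next; whether this is the right way to get a second geo_point is exactly what I am unsure about (dst_geoip is just the target name I plan to use in my second geoip filter):

```json
PUT _template/geo_src
{
  "template": "logstash-*",
  "order": 1,
  "mappings": {
    "_default_": {
      "properties": {
        "dst_geoip": {
          "dynamic": true,
          "properties": {
            "ip": { "type": "ip" },
            "latitude": { "type": "half_float" },
            "location": { "type": "geo_point" },
            "longitude": { "type": "half_float" }
          }
        }
      }
    }
  }
}
```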

Am I completely misunderstanding what I need to do to fix this issue? Thank you to everyone in advance.

-Marcin
