Geo_point mapping problem

Hi guys, I am an ELK noob willing to learn.
I've got a document:

 [...]
 "_source": {
    "network": {
      "type": "ipv4"
    },
    "interface_name": "LAN",
    "destination": {
      "geo": {
        "country_name": "United States",
        "region_code": "CA",
        "continent_code": "NA",
        "city_name": "Los Angeles",
        "region_name": "California",
        "longitude": -118.2578,
        "latitude": 34.0549,
        "dma_code": 803,
        "location": {
          "lon": -118.2578,
          "lat": 34.0549
        },
        "ip": "185.236.xxx.xxx",
        "postal_code": "90014",
        "country_code3": "US",
        "timezone": "America/Los_Angeles",
        "country_code2": "US"
      },
      "as": {
        "number": 9009,
        "ip": "185.236.xxx.xxx",
        "organization": {
          "name": "xxx Ltd"
        }
      },
      "ip": "185.236.xxx.xxx",
      "port": "63915"
    },
[...]

and I do this in Dev Tools:

PUT _template/pfsense
{
  "index_patterns": [
    "logstash-*"
  ],
  "mappings": {
    "properties": {
      "destination": {
        "properties": {
          "geo": {
            "dynamic": true,
            "properties": {
              "ip": {
                "type": "ip"
              },
              "location": {
                "type": "geo_point"
              }
            }
          },
          "as": {
            "dynamic": true,
            "properties": {
              "ip": {
                "type": "ip"
              },
              "location": {
                "type": "geo_point"
              }
            }
          }
        }
      },
      "source": {
        "properties": {
          "geo": {
            "dynamic": true,
            "properties": {
              "ip": {
                "type": "ip"
              },
              "location": {
                "type": "geo_point"
              }
            }
          },
          "as": {
            "dynamic": true,
            "properties": {
              "ip": {
                "type": "ip"
              },
              "location": {
                "type": "geo_point"
              }
            }
          }
        }
      }
    }
  }
}

Elasticsearch accepts it, I re-create the indexes and... no mapping. No geohash is generated for either Source or Destination... What am I doing wrong?

Welcome!

Could you share all the exact steps and commands you are sending to Elasticsearch?

Thanks @dadoonet for the reply. I am not sure what you want me to provide, but let me try:

  1. Stand up the docker-compose stack of Elasticsearch, Logstash and Kibana.
  2. Log in to Kibana.
  3. Open Dev Tools and PUT _template/pfsense as per the first post.
  4. Go to Index Patterns and create the index pattern logstash-*.
  5. No mapping :frowning: (a quick check that the template was at least stored is sketched right after this list)
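
As a quick sanity check (this assumes the legacy _template endpoint used in the PUT above; newer Elasticsearch versions use _index_template instead), the stored template can be fetched back with:

GET _template/pfsense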

I can see documents flowing in, and GeoIP works fine:

{
  "_index": "logstash-2020.03.03",
  "_type": "_doc",
  "_id": "q0eqn3ABHLWBg7p8ViMy",
  "_version": 1,
  "_score": null,
  "_source": {
    "tags": [
      "pf",
      "firewall",
      "GeoIP"
    ],
    "destination": {
      "geo": {
        "country_code2": "AU",
        "country_name": "Australia",
        "country_code3": "AU",
        "timezone": "Australia/Melbourne",
        "ip": "203.xxx.xxx.xxx",
        "location": {
          "lon": 145.069,
          "lat": -37.9258
        },
        "continent_code": "OC",
        "region_name": "Victoria",
        "city_name": "Bentleigh East",
        "region_code": "VIC",
        "longitude": 145.069,
        "latitude": -37.9258,
        "postal_code": "3165"
      },
      "as": {
        "organization": {
          "name": "TPG Telecom Limited"
        },
        "ip": "203.xxx.xxx.xxx",
        "number": 7545
      },
      "ip": "203.xxx.xxx.xxx",
      "port": "8000"
    },
    "@timestamp": "2020-03-03T09:11:45.355Z",
    "sequence_number": "2563024482",
    "direction": "inbound",
    "type": "syslog",
    "offset": "0",
    "event": {
      "original": "<134>Mar  3 20:12:41 filterlog: 98,,,1770009504,pppoe0,match,block,in,4,0x0,,240,54488,0,none,6,tcp,40,185.xxx.xxx.xxx,203.xxx.xxx.xxx,57865,8000,0,S,2563024482,,1024,,",
      "dataset": "firewall"
    },
    "received_from": "192.168.80.1",
    "interface": "pppoe0",
    "reason": "match",
    "source": {
      "geo": {
        "country_code2": "RU",
        "country_name": "Russia",
        "country_code3": "RU",
        "timezone": "Europe/Moscow",
        "latitude": 55.7386,
        "ip": "185.xxx.xxx.xxx",
        "location": {
          "lon": 37.6068,
          "lat": 55.7386
        },
        "continent_code": "EU",
        "longitude": 37.6068
      },
      "as": {
        "organization": {
          "name": "SS-Net"
        },
        "ip": "185.xxx.xxx.xxx",
        "number": 204428
      },
      "ip": "185.xxx.xxx.xxx",
      "port": "57865"
    },
    "flags": "none",
    "length": "40",
    "tracker": "1770009504",
    "transport": {
      "data_length": "0"
    },
    "interface_name": "PPPoE",
    "action": "block",
    "tcp_flags": "S",
    "rule_number": "98",
    "@version": "1",
    "id": "54488",
    "protocol": {
      "type": "tcp",
      "id": "6"
    },
    "window_size": "1024",
    "tos": "0x0",
    "pf_message": "98,,,1770009504,pppoe0,match,block,in,4,0x0,,240,54488,0,none,6,tcp,40,185.xxx.xxx.xxx,203.xxx.xxx.xxx,57865,8000,0,S,2563024482,,1024,,",
    "host": "192.168.80.1",
    "pf_timestamp": "Mar  3 20:12:41",
    "ttl": "240",
    "received_at": "2020-03-03T09:11:45.355Z",
    "network": {
      "type": "ipv4"
    },
    "pf_program": "filterlog"
  },
  "fields": {
    "received_at": [
      "2020-03-03T09:11:45.355Z"
    ],
    "@timestamp": [
      "2020-03-03T09:11:45.355Z"
    ]
  },
  "sort": [
    1583226705355
  ]
}

The question is: how can I map the Source and Destination geo fields as geo_point, so I can present them on a map?

Hope this explains.

I see. So you think that creating an index template in Elasticsearch is "enough"?

The index template is used when an index is created if the index name matches the index pattern.
It is not applied to any existing index.

If you run for example:

DELETE logstash-test
PUT logstash-test
GET logstash-test/_mapping

You should see that the mapping has been correctly set.

So you need to delete the existing indices and recreate them.
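
For example, something along these lines (careful: it deletes the already-indexed data; Logstash will recreate the indices as new events arrive, and the template will be applied at that point):

DELETE logstash-*

Depending on cluster settings (action.destructive_requires_name), wildcard deletes may be rejected; in that case delete the indices by name, e.g. DELETE logstash-2020.03.03.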

My 2 cents.

Thank you @dadoonet, this is exactly what I was thinking!

For all the poor souls trapped in the same delusion as I was, this is the process I followed:

  1. Deleted all existing indices.
  2. Started Elasticsearch and Kibana (with no Logstash).
  3. Logged in to Kibana.
  4. Opened Dev Tools and PUT _template/pfsense as per the first post.
  5. Restarted the whole stack.

All mappings are in place and everything works as expected.
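
As an optional double-check (a sketch using the field names and index pattern from the documents above), the get field mapping API should report both fields as type geo_point:

GET logstash-*/_mapping/field/destination.geo.location,source.geo.location

After that, refreshing the index pattern's field list in Kibana should expose the geo_point fields to the map visualizations.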

Thank you David once again :pray:
