Cannot find geolocation in Kibana Visualization

Hi

I am parsing IIS logs - I am using logstash 5.6.3

Edit: apparently this is known? https://github.com/logstash-plugins/logstash-filter-geoip/issues/123

How do I go about using location as an array instead of a hash?

I have created a grok filter and the information is retrieved correctly as can be seen in the screenshot below

[screenshot: grok filter output]

I created a root level geo-ip in the index pattern and sent a PUT request:

[screenshot: index pattern PUT request]

and it seems successful:

sudo curl -XPUT "http://localhost:9200/_template/filebeat" -d@/etc/filebeat/filebeat.template.json
{"acknowledged":true}
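For the root-level geoip.location to be indexed as a geo_point, the template being pushed needs a mapping along these lines. This is only a sketch - the actual contents of filebeat.template.json aren't shown in the thread, so the surrounding structure (template pattern, `_default_` type) is assumed:

```json
{
  "template": "filebeat-*",
  "mappings": {
    "_default_": {
      "properties": {
        "geoip": {
          "properties": {
            "location": {
              "type": "geo_point"
            }
          }
        }
      }
    }
  }
}
```

Also worth noting: a template only applies to indices created after it is pushed, so existing filebeat-* indices keep their old mapping.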

There are no errors in the elasticsearch logs either

But when I go to Kibana to create a visualization for this field, I cannot find it (I refreshed the field list)

[screenshot: Kibana visualization field list]

I checked the index pattern for this field and it's generated as a number field - maybe it needs to be changed in logstash?

[screenshot: index pattern showing the field as a number]

Following is the logstash output sent to stdout. The problem seems to be here with location: instead of putting the lat/lon in an array in the location field, it's added as "lat" and "lon" name/value pairs, and these resolve to numbers....

[screenshot: logstash stdout output]

Looking at the docs here: https://www.elastic.co/guide/en/elasticsearch/reference/current/geo-point.html
It looks fine that lat/lon are the way they are in your example. Are you expecting to see lat/lon appear in the list of fields when creating the index pattern? I'm trying to understand what exactly is the problem.
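For context, the geo_point docs linked above accept several equivalent representations, and the object form that logstash's geoip filter emits is one of them. Illustrative documents (values made up here):

```json
{ "location": { "lat": 30.0, "lon": 70.0 } }
{ "location": "30.0,70.0" }
{ "location": [ 70.0, 30.0 ] }
```

Any of these works if the field is mapped as geo_point; without such a mapping, dynamic mapping turns the object form into two plain number fields, which matches the symptom described here. Note the array form is [lon, lat] while the string form is "lat,lon".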

thanks,
a5a

Please don't post pictures of text, they are difficult to read and some people may not be even able to see them :slight_smile:

Sorry about that - I realize now myself that it does look disorganized

Hi

The problem is that I started by following a Pluralsight tutorial from 2015. All was going well until I got to the geo_point location in maps. I saw the Elastic docs after posting, and edited the top of my post with a link to someone else reporting the same problem.

I've managed to narrow down the problem I think

  1. Filebeat reads IIS logs and sends them to logstash

  2. I wrote a grok pattern in logstash to parse my IIS logs - this is the part of the pattern that extracts the client host:

    %{IPORHOST:clienthost}
    
  3. logstash groks the text and sends it forward to elasticsearch - following is part of the message that's posted

    "timestamp" => "2017-06-27 20:22:50",
          "geoip" => {
               "timezone" => "Asia/Karachi",
                     "ip" => "119.160.66.236",
               "latitude" => 30.0,
           "country_name" => "Pakistan",
          "country_code2" => "PK",
         "continent_code" => "AS",
          "country_code3" => "PK",
          "location" => {
      "lon" => 70.0,
      "lat" => 30.0
     },
              "longitude" => 70.0
     },
         "offset" => 76290130,
      "substatus" => "0",
    
  4. But Kibana's index patterns, instead of reading location as a geopoint, reads location.lat as a number and location.lon as a number

  5. Kibana's index patterns shows that the following are registered as geopoints:

    apache2.access.geoip.location 
    auditd.log.geoip.location  
    nginx.access.geoip.location
    system.auth.ssh.geoip.location  
    

But not geoip.location

So essentially my question is: how do I register the geoip.location field as a geo_point?

Thanks

Can you show the mapping of the index?

Hi Mark

I cannot attach here so I've pasted it in full here: https://pastebin.com/SHEuzDB8

Thanks for the help

Edit: in case it helps anyone else

  1. Fetch all indices with filebeat in their name:

    curl http://localhost:9200/_cat/indices?v | grep filebeat
    
  2. Find mapping for one of the indices (pick one)

     curl http://localhost:9200/filebeat-2017.10.301/_mapping

Also, in case it helps, here is the filter part in the beats.conf file used by logstash

 filter {
	if [type] == "syslog" {
		grok {
			match => {
				"message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}"
			}
		}
		date {
			match => ["syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss"]
		}
	}

	if [type] == "iis_log" {
		if [message] =~ "^#" {
			drop {}
		}

		grok {
			match => {
				"message" => ["^%{TIMESTAMP_ISO8601:timestamp} %{IPV4:ip} %{WORD:verb} %{URIPATH:uripath} (%{NOTSPACE:uriparam}|-) %{NUMBER:port} (%{NOTSPACE:username}|-) %{IPORHOST:clienthost} %{NOTSPACE:useragent} %{NUMBER:status} %{NUMBER:substatus} %{NUMBER:scstatus} %{NUMBER:timetaken:int}$"]
			}
		}

		geoip {
			source => "clienthost"
		}

	}

}
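Coming back to the array-vs-hash question from the top of the thread: the object form that geoip emits is fine once the field is mapped as a geo_point, but if an array is preferred, the classic recipe from older ELK tutorials builds one with mutate after the geoip filter. A sketch - `[geoip][coordinates]` is a made-up field name here, and it would also need its own geo_point mapping in the template:

```
geoip {
  source => "clienthost"
}
mutate {
  add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
  add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
}
mutate {
  convert => [ "[geoip][coordinates]", "float" ]
}
```

Longitude is added first because geo_point arrays are in [lon, lat] order.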

Following up on my own question - I PUT the new template definition to ES but Kibana doesn't load it on refresh

I think this is a bug https://github.com/elastic/kibana/issues/14021

I will downgrade ES and Kibana and update here once I have confirmation

I'd suggest you split the sources out into their own indices. Having so many different types of data in a single index can cause problems down the line (think mapping collisions), and separate indices also make per-source retention management easier.

Are you refreshing the mappings for the index pattern in Management? If so, note that the template won't be applied until a new index is created (i.e. at the next daily rollover, UTC 00:00). You may also run into mapping conflicts if you've done this, unfortunately.


Thanks Mark - that makes sense now..

I'm not sure if it's due to your geoip configuration. If it still doesn't work, you might want to try replacing it with the IP2Location filter, which provides similar geolocation features.

Below are the setup instructions.

https://www.ip2location.com/tutorials/how-to-use-ip2location-filter-plugin-with-ELK

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.