ELK 5.2.1 - still can't get geoip working

Hello,

I already asked about this on Jan 12, but without finding a solution. In the meantime I have upgraded to ELK 5.2.1 and tried again, but I still can't get geoip working.

The logstash filter is minimalistic:
geoip {
    source => "[attrs][source]"
}

The mapping is:
"geoip":{"dynamic":false,"type":"object","properties":{"ip":{"type":"ip"},"latitude":{"type":"float"},"location":{"type":"geo_point"},"longitude":{"type":"float"}}}

But still I get this error:

[2017-02-24T16:43:29,147][WARN ][logstash.outputs.elasticsearch] Failed action. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2017.02.24", :_type=>"reg-new", :_routing=>nil}, 2017-02-24T15:43:28.000Z %{host} %{message}], :response=>{"index"=>{"_index"=>"logstash-2017.02.24", "_type"=>"reg-new", "id"=>"AVpwypVBfx5AMx9veYw", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"parse_exception", "reason"=>"geo_point expected"}}}}}

Using the rubydebug output I can see data like this:

{
"@timestamp" => 2017-02-24T16:00:51.000Z,
"geoip" => {
"timezone" => "America/New_York",
"ip" => "208.108.139.10",
"latitude" => 39.9553,
"continent_code" => "NA",
"city_name" => "Zanesville",
"country_code2" => "US",
"country_name" => "United States",
"dma_code" => 596,
"country_code3" => "US",
"region_name" => "Ohio",
"location" => [
[0] -82.0718,
[1] 39.9553
],
"postal_code" => "43701",
"longitude" => -82.0718,
"region_code" => "OH"
},
"@version" => "1",
"id" => "14a4b651b612f835",
"type" => "reg-new",
[truncated]

Thanks in advance for any help...

Can you check the ES logs as well?

Here is what can be seen in the ES log. (I had to truncate it to be able to post it here, as unfortunately it is not possible to attach a txt file.)

[2017-02-27T14:17:10,489][DEBUG][o.e.a.b.TransportShardBulkAction] [9qMaf0O] [logstash-2017.02.27][4] failed to execute bulk item (index) index {[logstash-2017.02.27][reg-new][AVp_t7XZ4O0lARf556iU], source[{"@timestamp":"2017-02-27T13:17:10.000Z","geoip":{"timezone":"Europe/Vienna","ip":"178.114.232.210","latitude":48.2,"country_code2":"AT","country_name":"Austria","continent_code":"EU","country_code3":"AT","location":[16.3667,48.2],"longitude":16.3667},"@version":"1","id":"ac92cd7a7be9b357","type":"reg-new","ts":1488201430}]}
org.elasticsearch.index.mapper.MapperParsingException: failed to parse
at org.elasticsearch.index.mapper.DocumentParser.wrapInMapperParsingException(DocumentParser.java:175) ~[elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.index.mapper.DocumentParser.parseDocument(DocumentParser.java:69) ~[elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:275) ~[elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.index.shard.IndexShard.prepareIndex(IndexShard.java:533) ~[elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.index.shard.IndexShard.prepareIndexOnPrimary(IndexShard.java:510) ~[elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.action.index.TransportIndexAction.prepareIndexOperationOnPrimary(TransportIndexAction.java:196) ~[elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.action.index.TransportIndexAction.executeIndexRequestOnPrimary(TransportIndexAction.java:201) ~[elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.action.bulk.TransportShardBulkAction.shardIndexOperation(TransportShardBulkAction.java:348) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.action.bulk.TransportShardBulkAction.index(TransportShardBulkAction.java:155) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.action.bulk.TransportShardBulkAction.handleItem(TransportShardBulkAction.java:134) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.action.bulk.TransportShardBulkAction.onPrimaryShard(TransportShardBulkAction.java:120) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.action.bulk.TransportShardBulkAction.onPrimaryShard(TransportShardBulkAction.java:73) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.action.support.replication.TransportWriteAction.shardOperationOnPrimary(TransportWriteAction.java:76) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.action.support.replication.TransportWriteAction.shardOperationOnPrimary(TransportWriteAction.java:49) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.action.support.replication.TransportReplicationAction$PrimaryShardReference.perform(TransportReplicationAction.java:914) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.action.support.replication.TransportReplicationAction$PrimaryShardReference.perform(TransportReplicationAction.java:884) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.action.support.replication.ReplicationOperation.execute(ReplicationOperation.java:113) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.action.support.replication.TransportReplicationAction$AsyncPrimaryAction.onResponse(TransportReplicationAction.java:327) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.action.support.replication.TransportReplicationAction$AsyncPrimaryAction.onResponse(TransportReplicationAction.java:262) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.action.support.replication.TransportReplicationAction$1.onResponse(TransportReplicationAction.java:864) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.action.support.replication.TransportReplicationAction$1.onResponse(TransportReplicationAction.java:861) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.index.shard.IndexShardOperationsLock.acquire(IndexShardOperationsLock.java:142) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.index.shard.IndexShard.acquirePrimaryOperationLock(IndexShard.java:1652) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.action.support.replication.TransportReplicationAction.acquirePrimaryShardReference(TransportReplicationAction.java:873) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.action.support.replication.TransportReplicationAction.access$400(TransportReplicationAction.java:92) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.action.support.replication.TransportReplicationAction$AsyncPrimaryAction.doRun(TransportReplicationAction.java:279) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.action.support.replication.TransportReplicationAction$PrimaryOperationTransportHandler.messageReceived(TransportReplicationAction.java:258) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.action.support.replication.TransportReplicationAction$PrimaryOperationTransportHandler.messageReceived(TransportReplicationAction.java:250) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.transport.RequestHandlerRegistry.processMessageReceived(RequestHandlerRegistry.java:69) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.transport.TransportService$7.doRun(TransportService.java:610) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:596) [elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-5.2.1.jar:5.2.1]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_121]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_121]
at java.lang.Thread.run(Thread.java:745) [?:1.8.0_121]
Caused by: org.elasticsearch.ElasticsearchParseException: geo_point expected
at org.elasticsearch.common.geo.GeoUtils.parseGeoPoint(GeoUtils.java:456) ~[elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.index.mapper.BaseGeoPointFieldMapper.parse(BaseGeoPointFieldMapper.java:558) ~[elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrField(DocumentParser.java:449) ~[elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.index.mapper.DocumentParser.parseDynamicValue(DocumentParser.java:800) ~[elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.index.mapper.DocumentParser.parseValue(DocumentParser.java:582) ~[elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.index.mapper.DocumentParser.parseNonDynamicArray(DocumentParser.java:564) ~[elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.index.mapper.DocumentParser.parseArray(DocumentParser.java:525) ~[elasticsearch-5.2.1.jar:5.2.1]
at org.elasticsearch.index.mapper.DocumentParser.innerParseObject(DocumentParser.java:384) ~[elasticsearch-5.2.1.jar:5.2.1]

That geo_point expected error message indicates that the location field is not formatted correctly, or that there is some strange mapping collision for one or more of the records going through. The record you've pasted does have the field, though:

"location" => [
[0] -82.0718,
[1] 39.9553
]

Even the Elasticsearch log shows a correctly formatted field:

{"@timestamp":"2017-02-27T13:17:10.000Z","geoip":{"timezone":"Europe/Vienna","ip":"178.114.232.210","latitude":48.2,"country_code2":"AT","country_name":"Austria","continent_code":"EU","country_code3":"AT","location":[16.3667,48.2],"longitude":16.3667},"@version":"1","id":"ac92cd7a7be9b357","type":"reg-new","ts":1488201430}]}

notably: "location":[16.3667,48.2]

This is an acceptable geo_point format, so far as I can tell, so I'm not sure why Elasticsearch is rejecting it. Are you just using the index template that comes with Logstash? (If you don't know, then the answer is yes.)
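
If you're not sure, you can check which index templates are installed on the cluster; the stock Logstash one is named logstash. Something along these lines (the host and port here are just the default local endpoint) will list them:

    curl -XGET 'localhost:9200/_template?pretty'

    # or only the Logstash-managed one:
    curl -XGET 'localhost:9200/_template/logstash?pretty'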

I'm using a custom index template (via the template => "/var/lib/logstash/logstash-elasticsearch-template.json" option of the elasticsearch output plugin in Logstash).
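
For context, the relevant output section looks roughly like this (the hosts value is only a placeholder; the template path is the real one):

    output {
      elasticsearch {
        hosts    => ["localhost:9200"]
        template => "/var/lib/logstash/logstash-elasticsearch-template.json"
      }
    }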

Here is the complete index template:

{
  "template" : "*",
  "settings" : {
      "index" : {
          "refresh_interval" : "30s",
          "analysis": {
              "analyzer": {
                  "uri_analyzer": {
                      "type": "custom",
                      "tokenizer": "whitespace",
                      "filter": ["stop", "uri_stop"]
                  },
                  "usernameNgram": {
                      "type": "custom",
                      "filter": "lowercase",
                      "tokenizer": "customNgram"
                  }
              },
              "filter": {
                  "uri_stop": {
                      "type": "stop",
                      "stopwords": ["sip", "http", "https"]
                  }
              },
              "tokenizer": {
                  "customNgram": {
                      "min_gram": "3",
                      "type": "nGram",
                      "max_gram": "20"
                  }
              }
          }
      }
  },
  "mappings" : {
    "_default_" : {
        "_all": { "enabled":  false }
    },
    "event" : {
       "dynamic_templates" : [ {
         "full_text_fields" : {
           "path_match" : "attrs.details.*",
           "match_mapping_type" : "string",
           "mapping" : {
             "type" : "string", "index" : "analyzed", "omit_norms" : true,
               "fields" : {
                 "raw" : {"type": "string", "index" : "not_analyzed", "ignore_above" : 1024}
               }
           }
         }
       }, {
         "event_fields" : {
           "path_match" : "attrs.*",
           "match_mapping_type" : "string",
           "mapping" : {
               "type" : "string",
               "index" : "not_analyzed",
               "omit_norms" : true
           }
         }
       } ],
       "properties" : {
         "@version": { "type": "string", "index": "not_analyzed" },
         "type": { "type": "string", "index": "not_analyzed" },
         "attrs" : {
             "properties" : {
                 "from": {
                     "type" : "string",
                     "index" : "not_analyzed",
                     "fields" : {
                         "ngram" : {
                             "type" : "string",
                             "analyzer": "usernameNgram"
                         }
                     }
                 },
                 "to": {
                     "type" : "string",
                     "index" : "not_analyzed",
                     "fields" : {
                         "ngram" : {
                             "type" : "string",
                             "analyzer": "usernameNgram"
                         }
                     }
                 },
                 "reason": {
                     "type" : "string",
                     "fields" : {
                         "exact" : {
                             "type" : "string",
                             "index" : "not_analyzed"
                         }
                     }
                 },
                 "duration": {
                     "type" : "float",
                      "index" : "not_analyzed"
                 }
             }
         },
	 "geoip":{
		"dynamic":false,
		"type":"object",
		"properties":{
			"ip":{"type":"ip"},
			"latitude":{"type":"float"},
			"location":{"type":"geo_point"},
			"longitude":{"type":"float"}
		}
	},
        "host": { "type": "string" },
        "plugin": { "type": "string", "index": "not_analyzed" },
        "type_instance": { "type": "string", "index": "not_analyzed" },
        "collectd_type": { "type": "string", "index": "not_analyzed" }
       }
    }
  }
}

@paka I've edited your post for you, but please remember to encapsulate pre-formatted text with triple back-tick characters, e.g. ``` on its own line, your text, and then ``` on the next line, by itself.

I can't see anything wrong there, but that's just a cursory look at it. Are there any lines going through Logstash which do not contain the geoip structure? I ask, because the error looks like something is missing:

2017-02-24T15:43:28.000Z %{host} %{message}

That looks empty.

Otherwise, my next guess is to try sending to the default template and a logstash-* named index and see what happens.
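
That is, something along these lines in the elasticsearch output (the test index name is only an example; anything matching logstash-* picks up the stock template, which already maps geoip.location as geo_point):

    elasticsearch {
      index => "logstash-test-%{+YYYY.MM.dd}"
      # no template option here, so the default Logstash-managed template is used
    }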

Thanks for the hints, and sorry for the delay.

I tried the following:

  • still minimalistic geoip filter in logstash:
        geoip {
            source => "[attrs][source]"
        }
  • used new output index name logstash-test, with no custom mapping:
        elasticsearch {
            index => "logstash-test-%{+YYYY.MM.dd}"
        }

but I still get the error:

[2017-03-20T10:06:34,179][WARN ][logstash.outputs.elasticsearch] Failed action. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-test-2017.03.20", :_type=>"reg-new", :_routing=>nil}, 2017-03-20T09:06:34.000Z %{host} %{message}], :response=>{"index"=>{"_index"=>"logstash-test-2017.03.20", "_type"=>"reg-new", "_id"=>"AVrq99I_X48c2rfE0TC_", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"parse_exception", "reason"=>"geo_point expected"}}}}}

Yes, there are some messages that do not contain geoip; the filter is applied only to some of them.
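
For illustration, that kind of conditional application looks roughly like this in the filter section (the exact condition in our config may differ):

    filter {
      if [attrs][source] {
        geoip {
          source => "[attrs][source]"
        }
      }
    }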

Can you pastebin the mapping? It sounds as though something is going wrong with some of your fields.

Here is the complete mapping template file.
But I get the same error even if I use no custom mapping and an index name matching logstash-* (in which case I assume the default mapping should be used).
There has to be something basic I'm missing :-)

{
  "template" : "*",
  "settings" : {
      "index" : {
          "refresh_interval" : "30s",
          "analysis": {
              "analyzer": {
                  "uri_analyzer": {
                      "type": "custom",
                      "tokenizer": "whitespace",
                      "filter": ["stop", "uri_stop"]
                  },
                  "usernameNgram": {
                      "type": "custom",
                      "filter": "lowercase",
                      "tokenizer": "customNgram"
                  }
              },
              "filter": {
                  "uri_stop": {
                      "type": "stop",
                      "stopwords": ["sip", "http", "https"]
                  }
              },
              "tokenizer": {
                  "customNgram": {
                      "min_gram": "3",
                      "type": "nGram",
                      "max_gram": "20"
                  }
              }
          }
      }
  },
  "mappings" : {
    "_default_" : {
        "_all": { "enabled":  false }
    },
    "event" : {
       "dynamic_templates" : [ {
         "full_text_fields" : {
           "path_match" : "attrs.details.*",
           "match_mapping_type" : "string",
           "mapping" : {
             "type" : "string", "index" : "analyzed", "omit_norms" : true,
               "fields" : {
                 "raw" : {"type": "string", "index" : "not_analyzed", "ignore_above" : 1024}
               }
           }
         }
       }, {
         "event_fields" : {
           "path_match" : "attrs.*",
           "match_mapping_type" : "string",
           "mapping" : {
               "type" : "string",
               "index" : "not_analyzed",
               "omit_norms" : true
           }
         }
       } ],
       "properties" : {
         "@version": { "type": "string", "index": "not_analyzed" },
         "type": { "type": "string", "index": "not_analyzed" },
         "attrs" : {
             "properties" : {
                 "from": {
                     "type" : "string",
                     "index" : "not_analyzed",
                     "fields" : {
                         "ngram" : {
                             "type" : "string",
                             "analyzer": "usernameNgram"
                         }
                     }
                 },
                 "to": {
                     "type" : "string",
                     "index" : "not_analyzed",
                     "fields" : {
                         "ngram" : {
                             "type" : "string",
                             "analyzer": "usernameNgram"
                         }
                     }
                 },
                 "reason": {
                     "type" : "string",
                     "fields" : {
                         "exact" : {
                             "type" : "string",
                             "index" : "not_analyzed"
                         }
                     }
                 },
                 "duration": {
                     "type" : "float",
                      "index" : "not_analyzed"
                 }
             }
         },
	 "geoip":{
		"dynamic":false,
		"type":"object",
		"properties":{
			"ip":{"type":"ip"},
			"latitude":{"type":"float"},
			"location":{"type":"geo_point"},
			"longitude":{"type":"float"}
		}
	},
        "host": { "type": "string" },
        "plugin": { "type": "string", "index": "not_analyzed" },
        "type_instance": { "type": "string", "index": "not_analyzed" },
        "collectd_type": { "type": "string", "index": "not_analyzed" }
       }
    }
  }
}

Update: I tried again on a fresh installation, using the newer ELK 5.3 versions - still not working.

It seems strange that such a basic thing is still not working, even when trying a very basic configuration according to the docs, and I have been fighting with this for a quarter of a year already :-(

This is the template Logstash 5.3.0 put into Elasticsearch 5.3.0:

{
  "logstash" : {
    "order" : 0,
    "version" : 50001,
    "template" : "logstash-*",
    "settings" : {
      "index" : {
        "refresh_interval" : "5s"
      }
    },
    "mappings" : {
      "_default_" : {
        "dynamic_templates" : [
          {
            "message_field" : {
              "path_match" : "message",
              "mapping" : {
                "norms" : false,
                "type" : "text"
              },
              "match_mapping_type" : "string"
            }
          },
          {
            "string_fields" : {
              "mapping" : {
                "norms" : false,
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword"
                  }
                }
              },
              "match_mapping_type" : "string",
              "match" : "*"
            }
          }
        ],
        "_all" : {
          "norms" : false,
          "enabled" : true
        },
        "properties" : {
          "@timestamp" : {
            "include_in_all" : false,
            "type" : "date"
          },
          "geoip" : {
            "dynamic" : true,
            "properties" : {
              "ip" : {
                "type" : "ip"
              },
              "latitude" : {
                "type" : "half_float"
              },
              "location" : {
                "type" : "geo_point"
              },
              "longitude" : {
                "type" : "half_float"
              }
            }
          },
          "@version" : {
            "include_in_all" : false,
            "type" : "keyword"
          }
        }
      }
    },
    "aliases" : { }
  }
}

With this config:

input { stdin {} }

filter {
  mutate { add_field => { "[attrs][source]" => "8.8.8.8" } }
  geoip { source => "[attrs][source]" }
}

output {
  stdout { codec => rubydebug }
  elasticsearch { }
}

I get this output:

Howdy
{
    "@timestamp" => 2017-04-07T15:31:50.149Z,
         "geoip" => {
              "timezone" => "America/Los_Angeles",
                    "ip" => "8.8.8.8",
              "latitude" => 37.386,
        "continent_code" => "NA",
             "city_name" => "Mountain View",
         "country_code2" => "US",
          "country_name" => "United States",
              "dma_code" => 807,
         "country_code3" => "US",
           "region_name" => "California",
              "location" => [
            [0] -122.0838,
            [1] 37.386
        ],
           "postal_code" => "94035",
             "longitude" => -122.0838,
           "region_code" => "CA"
    },
      "@version" => "1",
          "host" => "REDACTED",
       "message" => "Howdy",
         "attrs" => {
        "source" => "8.8.8.8"
    }
}

I get no errors from Elasticsearch. The template and mapping look fine:

# Snipped...
          "geoip" : {
            "dynamic" : "true",
            "properties" : {
              "ip" : {
                "type" : "ip"
              },
              "latitude" : {
                "type" : "half_float"
              },
              "location" : {
                "type" : "geo_point"
              },
              "longitude" : {
                "type" : "half_float"
              }

I can confirm that your template won't work, however; something in your mapping is triggering this error.

More testing is required, but I can tell you that your mapping syntax is deprecated: not_analyzed is no longer used in 5.x, where string fields have been replaced by text and keyword.
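
For example, where your template has the 2.x-style

    "type": { "type": "string", "index": "not_analyzed" }

the 5.x equivalent is simply

    "type": { "type": "keyword" }

and an analyzed string field becomes "type": "text".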

Aha! It works now. Your mapping puts the geo_point underneath the mapping _type of event. If you do not send the event through Logstash with the correct document_type, then the mapping template will not apply, as the default document_type in Logstash is logs.

With this configuration:

input { stdin {} }

filter {
  mutate {
    add_field => { "[attrs][source]" => "8.8.8.8" }
  }
  geoip { source => "[attrs][source]" }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    manage_template => false 
   # This prevents the default Logstash template from interfering with your mapping 
   # template, which I manually applied
  }
}

I get this output:

This will fail
{
    "@timestamp" => 2017-04-07T15:47:51.879Z,
         "geoip" => {
              "timezone" => "America/Los_Angeles",
                    "ip" => "8.8.8.8",
              "latitude" => 37.386,
        "continent_code" => "NA",
             "city_name" => "Mountain View",
         "country_code2" => "US",
          "country_name" => "United States",
              "dma_code" => 807,
         "country_code3" => "US",
           "region_name" => "California",
              "location" => [
            [0] -122.0838,
            [1] 37.386
        ],
           "postal_code" => "94035",
             "longitude" => -122.0838,
           "region_code" => "CA"
    },
      "@version" => "1",
          "host" => "thunderbolt-display.untergeek.net",
       "message" => "This will fail",
         "attrs" => {
        "source" => "8.8.8.8"
    }
}
[2017-04-07T09:47:52,031][WARN ][logstash.outputs.elasticsearch] Failed action. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2017.04.07", :_type=>"logs", :_routing=>nil}, 2017-04-07T15:47:51.879Z thunderbolt-display.untergeek.net This will fail], :response=>{"index"=>{"_index"=>"logstash-2017.04.07", "_type"=>"logs", "_id"=>"AVtJGbBDIxmgySnf2h_6", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"parse_exception", "reason"=>"geo_point expected"}}}}}

This is what you've been seeing: the geo_point expected error.

Now I change the configuration to set type => "event" on the input:

input {
  stdin { type => "event" }
}

filter {
  mutate {
    add_field => { "[attrs][source]" => "8.8.8.8" }
  }
  geoip { source => "[attrs][source]" }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    manage_template => false
  }
}

Not only do I get no error:

This will work
{
    "@timestamp" => 2017-04-07T15:51:12.303Z,
         "geoip" => {
              "timezone" => "America/Los_Angeles",
                    "ip" => "8.8.8.8",
              "latitude" => 37.386,
        "continent_code" => "NA",
             "city_name" => "Mountain View",
         "country_code2" => "US",
          "country_name" => "United States",
              "dma_code" => 807,
         "country_code3" => "US",
           "region_name" => "California",
              "location" => [
            [0] -122.0838,
            [1] 37.386
        ],
           "postal_code" => "94035",
             "longitude" => -122.0838,
           "region_code" => "CA"
    },
      "@version" => "1",
          "host" => "REDACTED",
       "message" => "This will work",
          "type" => "event",
         "attrs" => {
        "source" => "8.8.8.8"
    }
}
^C[2017-04-07T09:51:14,675][WARN ][logstash.runner          ] SIGINT received. Shutting down the agent.
[2017-04-07T09:51:14,683][WARN ][logstash.agent           ] stopping pipeline {:id=>"main"}

I can also query that document:

curl -XGET localhost:9200/_search?pretty -d '{"query":{ "match_all" : {} } }'
{
  "took" : 31,
  "timed_out" : false,
  "_shards" : {
    "total" : 5,
    "successful" : 5,
    "failed" : 0
  },
  "hits" : {
    "total" : 1,
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "logstash-2017.04.07",
        "_type" : "event",
        "_id" : "AVtJHL_tqkbjadN9d_gn",
        "_score" : 1.0,
        "_source" : {
          "@timestamp" : "2017-04-07T15:51:12.303Z",
          "geoip" : {
            "timezone" : "America/Los_Angeles",
            "ip" : "8.8.8.8",
            "latitude" : 37.386,
            "continent_code" : "NA",
            "city_name" : "Mountain View",
            "country_code2" : "US",
            "country_name" : "United States",
            "dma_code" : 807,
            "country_code3" : "US",
            "region_name" : "California",
            "location" : [
              -122.0838,
              37.386
            ],
            "postal_code" : "94035",
            "longitude" : -122.0838,
            "region_code" : "CA"
          },
          "@version" : "1",
          "host" : "REDACTED",
          "message" : "This will work",
          "type" : "event",
          "attrs" : {
            "source" : "8.8.8.8"
          }
        }
      }
    ]
  }
}

Note that _type is "event", so your mapping applies. That _type mismatch is why your mapping has been failing to handle GeoIP properly.
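
If you want to double-check this on your side, looking at the mapping of one of the failing daily indices should show under which _type the geoip fields were dynamically created - the index name below is taken from your earlier error message:

    curl -XGET 'localhost:9200/logstash-2017.02.24/_mapping?pretty'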

Thanks a lot!

I'm trying it now. The trouble seems to be that our incoming event data already contains a "type" field (used for something else, with varying values), and even if I add

type => "event"

to the input plugin settings (we use the redis input plugin), it does not help: the "type" field already present in the incoming data seems to take precedence, both "type" and "_type" end up set to that value instead of "event", and the mapping does not take effect.

It looks like a clean solution would be to change the source side to use a different field name, not "type".
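
(Another option I'm looking at, assuming I'm reading the elasticsearch output plugin docs correctly, would be forcing the type in the output instead of relying on the event's type field, something like:

    elasticsearch {
      document_type => "event"
      template      => "/var/lib/logstash/logstash-elasticsearch-template.json"
    }

though then the original type value would no longer be used as _type.)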

The clean solution would actually be to update your template so that the geoip definition is not nested under the "event" type but lives in the _default_ mapping, so that it applies to any type, not just event. This is how it's mapped in the default Logstash template.
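
Concretely, that means moving the geoip block out of the event mapping and up into _default_, roughly like this (only the relevant part of your template is shown; the event-specific properties stay where they are):

    "mappings" : {
      "_default_" : {
        "_all": { "enabled": false },
        "properties" : {
          "geoip" : {
            "dynamic": false,
            "type": "object",
            "properties": {
              "ip":        { "type": "ip" },
              "latitude":  { "type": "float" },
              "location":  { "type": "geo_point" },
              "longitude": { "type": "float" }
            }
          }
        }
      }
    }

That way the geo_point mapping is applied no matter which _type the events arrive with.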
