GeoIP is not discoverable by Elasticsearch and Kibana

Hello everyone,

I am currently working with the ELK stack, version 5.6.3.

I have JSON logs with two levels of nesting (nested JSON, in technical terms). Each log entry contains geo coordinates.

I have written an index template for this and provided its path in the Elasticsearch output; however, I am unable to capture the geo coordinates as a geohash in the Kibana Coordinate Map visualization.
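As far as I understand, Elasticsearch only accepts a geo_point value as a lat/lon object, a "lat,lon" string, or a geohash string, for example something like:

{ "location": { "lat": 37.57, "lon": 126.98 } }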

Here are two sample log lines:

{"timestamp": "2014-10-15T22:57:47.357975", "isp": "Spacenet", "data": "Product name: 1747-L551/C C/10 - DC 3.46 \nVendor ID: Rockwell Automation/Allen-Bradley\nSerial number: 0xbc3705b8\nDevice type: Communications Adapter\nDevice IP: 192.168.1.10", "port": 44818, "hostnames": ["misc-148-65-113-11.pool.starband.net"], "location": {"city": null, "region_name": null, "area_code": null, "longitude": -97.0, "country_code3": "USA", "latitude": 38.0, "postal_code": null, "dma_code": null, "country_code": "US", "country_name": "United States"}, "ip": 2487316747, "domains": ["starband.net"], "org": "Spacenet", "os": null, "asn": "AS16811", "ip_str": "148.65.113.11"}
{"timestamp": "2014-10-15T22:54:52.251078", "isp": "Korea Telecom", "data": "Product name: 1763-L16BWA B/12.00\nVendor ID: Rockwell Automation/Allen-Bradley\nSerial number: 0x9ca04fb4\nDevice type: Communications Adapter\nDevice IP: 192.168.0.200", "port": 44818, "hostnames": [], "location": {"city": null, "region_name": null, "area_code": null, "longitude": 126.98000000000002, "country_code3": "KOR", "latitude": 37.56999999999999, "postal_code": null, "dma_code": null, "country_code": "KR", "country_name": "Korea, Republic of"}, "ip": 3076597918, "domains": [], "org": "Korea Telecom", "os": null, "asn": "AS4766", "ip_str": "183.97.40.158"}

My Logstash configuration file:

input {
	stdin { codec => json_lines }
}
filter{
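	# the source JSON carries the IP both as a number ("ip") and a string ("ip_str"); keep the numeric one under a different name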
	mutate {
		rename => { "ip" => "ip_num" }
	}
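	# look up GeoIP details for the string IP and store them under the "geoip" field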
	geoip {
		source => "ip_str"
		target => "geoip"
	}
}
output {
	stdout { codec => rubydebug }
	elasticsearch {
		hosts => ["localhost:9200"]
		index => "ethernet-%{+YYYY.MM.dd}"
		template => "C:\ELK01\logstash-5.6.3\bin\ethernet-template.json"
		template_name => "ethernet-*"
	}
}

The template file is ethernet-template.json (the path given in the output above).
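I have not pasted the whole template here, but the part relevant to this question is a geo_point mapping for the location under geoip, roughly along these lines (a simplified sketch of the kind of mapping I mean; my actual file may differ in which fields it matches):

{
  "template": "ethernet-*",
  "mappings": {
    "logs": {
      "properties": {
        "geoip": {
          "properties": {
            "location": { "type": "geo_point" }
          }
        }
      }
    }
  }
}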

Currently, to check whether the parsing is correct, I copy one record and paste it into the console (stdin). The event is parsed, but Logstash then reports the following error on the console:

{
         "geoip" => {
             "city_name" => "Namyangju",
              "timezone" => "Asia/Seoul",
                    "ip" => "183.97.40.158",
              "latitude" => 37.6367,
          "country_name" => "Republic of Korea",
         "country_code2" => "KR",
        "continent_code" => "AS",
         "country_code3" => "KR",
           "region_name" => "Gyeonggi-do",
              "location" => {
            "lon" => 127.2142,
            "lat" => 37.6367
        },
           "region_code" => "41",
             "longitude" => 127.2142
    },
          "data" => "Product name: 1763-L16BWA B/12.00\nVendor ID: Rockwell Automation/Allen-Bradley\nSerial number: 0x9ca04fb4\nDevice type: Communications Adapter\nDevice IP: 192.168.0.200",
            "os" => nil,
           "org" => "Korea Telecom",
           "isp" => "Korea Telecom",
       "domains" => [],
     "hostnames" => [],
        "ip_num" => 3076597918,
    "@timestamp" => 2017-12-21T09:07:17.069Z,
          "port" => 44818,
      "@version" => "1",
          "host" => "DESKTOP-1FH855S",
      "location" => {
         "country_code" => "KR",
                 "city" => nil,
            "area_code" => nil,
             "latitude" => 37.56999999999999,
             "dma_code" => nil,
         "country_name" => "Korea, Republic of",
        "country_code3" => "KOR",
          "region_name" => nil,
          "postal_code" => nil,
            "longitude" => 126.98000000000002
    },
           "asn" => "AS4766",
        "ip_str" => "183.97.40.158",
     "timestamp" => "2014-10-15T22:54:52.251078"
}
[2017-12-21T14:37:18,451][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"ethernet-2017.12.21", :_type=>"logs", :_routing=>nil}, 2017-12-21T09:07:17.069Z DESKTOP-1FH855S %{message}], :response=>{"index"=>{"_index"=>"ethernet-2017.12.21", "_type"=>"logs", "_id"=>"AWB4U67tuRx6_ULpdebj", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"parse_exception", "reason"=>"field must be either [lat], [lon] or [geohash]"}}}}}

Could someone assist me with the error reported above?
