Query regarding GeoIP


(R) #1

Hi Guys,

I am ingesting nginx logs with a custom index name, and the logs are being ingested properly; I can even see the country name resolved absolutely fine in the documents. However, when I build a visualization I keep getting this error:

"No Compatible Fields: The "new-nginx-*" index pattern does not contain any of the following field types: geo_point"

I even went through the link "https://www.elastic.co/blog/geoip-in-the-elastic-stack" but something is still missing. Can someone please help me troubleshoot?

```
filter {
  grok {
    match => [ "message", "%{COMBINEDAPACHELOG}+%{GREEDYDATA:extra_fields}" ]
    overwrite => [ "message" ]
  }

  mutate {
    convert => ["response", "integer"]
    convert => ["bytes", "integer"]
    convert => ["responsetime", "float"]
  }

  geoip {
    source => "clientip"
    target => "geoip"
    add_tag => [ "nginx-geoip" ]
  }

  date {
    match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
    remove_field => [ "timestamp" ]
  }

  useragent {
    source => "agent"
  }
}
```

And here is the stdout output (rubydebug codec) for a sample event:

```
{
    "request" => "/global-threat.txt",
    "agent" => "\"v0.61\"",
    "geoip" => {
        "timezone" => "Europe/London",
        "ip" => "149.126.76.81",
        "latitude" => 51.4964,
        "country_name" => "United Kingdom",
        "country_code2" => "GB",
        "continent_code" => "EU",
        "country_code3" => "GB",
        "location" => {
            "lon" => -0.1224,
            "lat" => 51.4964
        },
        "longitude" => -0.1224
    },
    "auth" => "-",
    "ident" => "-",
    "verb" => "GET",
    "message" => "149.126.76.81 - - [31/Aug/2017:11:25:29 +0530] \"GET /global-threat.txt HTTP/1.1\" 200 163312 \"-\" \"v0.61\"",
    "referrer" => "\"-\"",
    "@timestamp" => 2017-09-03T06:21:54.764Z,
    "response" => "200",
    "bytes" => "163312",
    "clientip" => "149.126.76.81",
    "@version" => "1",
    "host" => "0.0.0.0",
    "httpversion" => "1.1",
    "timestamp" => "31/Aug/2017:11:25:29 +0530"
}
```


(R) #2

I noticed the same thing with my IIS logs, and I am unable to plot the GeoIP visualization there either. I am also using a custom index name, iis-logs.w3c-*.


(Mark Walkom) #3

What is the mapping of the index?


(R) #4

I am sorry; what exact information do you need? Would you like to look at the Logstash config file?
I apologize again; I am a novice, so please bear with me, as I don't quite understand your question.

I would highly appreciate it if you could tell me how to find it, and I will provide that information for you.


(Magnus Bäck) #5

Mappings are basically the data types of the fields in the ES index. The mappings are partly determined by index templates applied based on the name of the index.

Until you know more about mappings I suggest that you don't use custom index names and in general refrain from touching default settings. Get things working first, then begin customizing them to your liking. If you use the default index name you'll get the geo_point field you're looking for.
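To illustrate (a minimal sketch; the host value is a placeholder for your own): with no index option at all, the elasticsearch output writes to the default logstash-%{+YYYY.MM.dd} indices, and the default index template, which matches logstash-*, maps geoip.location as geo_point out of the box:

```
output {
  elasticsearch {
    hosts => "localhost:9200"
    # no "index" and no "manage_template" options:
    # the index name defaults to "logstash-%{+YYYY.MM.dd}" and the
    # stock template (matching "logstash-*") is installed automatically
  }
}
```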


(R) #6

Ok, thanks for the valuable input. So in this case, what would the default index name be? logstash-*? Or, if ingesting via Filebeat, would filebeat-* be the default index name?


(Mark Walkom) #7

That will depend on your output config.


(R) #8

Hi Guys,

Here is the mapping I got from http://192.168.5.15:9200/_template/new-nginx

```
{"new-nginx":{"order":0,"version":50001,"template":"logstash-","settings":{"index":{"refresh_interval":"5s"}},"mappings":{"default":{"dynamic_templates":[{"message_field":{"path_match":"message","mapping":{"norms":false,"type":"text"},"match_mapping_type":"string"}},{"string_fields":{"mapping":{"norms":false,"type":"text","fields":{"keyword":{"ignore_above":256,"type":"keyword"}}},"match_mapping_type":"string","match":""}}],"_all":{"norms":false,"enabled":true},"properties":{"@timestamp":{"include_in_all":false,"type":"date"},"geoip":{"dynamic":true,"properties":{"ip":{"type":"ip"},"latitude":{"type":"half_float"},"location":{"type":"geo_point"},"longitude":{"type":"half_float"}}},"@version":{"include_in_all":false,"type":"keyword"}}}},"aliases":{}}}
```

And I am getting this error:

No Compatible Fields: The "new-nginx-*" index pattern does not contain any of the following field types: geo_point

So if you could tell me what exactly needs to be done, I would really appreciate it, and I will follow the same procedure for my other indices.


(Magnus Bäck) #9

So in this case, what would the default index name be? logstash-*?

See the default value of the index option in the elasticsearch output documentation.

Or, if ingesting via Filebeat, would filebeat-* be the default index name?

If sending directly from Filebeat to ES? I don't know. If sending via Logstash? No.

Here is the mapping I got it from http://192.168.5.15:9200/_template/new-nginx

That's the index template, not the actual mappings. Does the template value actually contain logstash-*? Please always post logs, configuration, command output, and JSON as preformatted text so it doesn't get mangled like this.

What does your elasticsearch output configuration look like now? What are the actual mappings of the index (use ES's get mapping API)?


(R) #10

Hi There,

Here is the information I received from the mapping API:

```
{"new-nginx-2017.07.08":{"mappings":{"nginx-access":{"properties":{"@timestamp":{"type":"date"},"@version":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"agent":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"auth":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"build":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"bytes":{"type":"long"},"clientip":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"device":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"geoip":{"properties":{"city_name":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"continent_code":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"country_code2":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"country_code3":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"country_name":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"ip":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"latitude":{"type":"float"},"location":{"properties":{"lat":{"type":"float"},"lon":{"type":"float"}}},"longitude":{"type":"float"},"postal_code":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"region_code":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"region_name":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"timezone":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}}}},"host":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"httpversion":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"ident":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"major":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"message":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"minor":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"name":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"os":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"os_name":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"patch":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"path":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"referrer":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"request":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"response":{"type":"long"},"tags":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"type":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"verb":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}}}}}}}
```

I ran this; I hope this is correct?

http://192.168.5.15:9200/new-nginx-2017.07.08/_mapping

And here is my Logstash output config:

```
output {
  elasticsearch {
    manage_template => false
    hosts => "192.168.5.15:9200"
    index => "new-nginx-%{+YYYY.MM.dd}"
  }
  stdout {
    codec => "rubydebug"
  }
}
```


(Magnus Bäck) #11

Please always post logs, configuration, command output, and JSON as preformatted text (there's a toolbar button for it) so it doesn’t get mangled.

With manage_template => false you're telling Logstash that you'll manage the index templates yourself. Have you created an index template that matches new-nginx-* indexes? The template you posted earlier applies to indexes matching logstash-*.


(R) #12

I see, and thanks for that. I will make sure to do so in my future posts!


(R) #13

Hey Magnus,

So in this case, if I send data from Filebeat directly into Elasticsearch, will it map the geo field as well? I have not tried that yet, but I wanted to know whether it is possible, and whether Tile Map visualizations would work.

Or do I need a template as well as a mapping defined?

Can you please guide me?


(Magnus Bäck) #14

ES doesn't care if it's Logstash or Filebeat that posts the data. With an index template in place and a correctly named index you'll get the mappings you want.


(R) #15

I see. And how do I check the exact mapping of geo_point? Can you please guide me? I am going to try shipping /var/log/secure from my CentOS 6 box to ES over Filebeat and see if the geo_point fields are getting set.

If not, I can troubleshoot on my end and won't have to bother you again and again.


(Magnus Bäck) #16

I see. And how do I check the exact mapping of geo_point?

You're already familiar with the /_mapping endpoint so you can check the mappings of any index.
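For example, the get field mapping API can narrow the answer down to just the geoip.location field (the host and index name below are only examples; substitute your own):

```
curl -s 'http://192.168.5.15:9200/logstash-2017.09.03/_mapping/field/geoip.location?pretty'
```

If the field is mapped correctly, the response should show "type": "geo_point" for that field.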


(R) #17

Here is another twist :frowning:

I set up ES on CentOS 7 and Filebeat on CentOS 6, with output EShost:9200 and input /var/log/secure.
Dang, the index is created, but surprisingly there is no data.

I am giving Logstash a try now; let's see if at least that gets me data in Kibana!


(R) #18

Nah, something is wrong; I am unable to import the secure logs into ES. Am I missing anything?


(R) #19

Yes, finally success!! Mapping the data with the default index works :slight_smile: and I am perfectly able to map the geo data.

Thanks a ton for your help.

What do I need to do for my custom index?


(Magnus Bäck) #20

What do I need to do for my custom index?

Change your existing index template or create a new one that matches the intended name of your indexes.
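For example (a sketch, not a verbatim recipe: the body below is a trimmed-down version of the stock Logstash 5.x template keeping only the geoip properties, so adapt it to your setup), re-pointing a template at the new-nginx-* pattern could look like this:

```
curl -XPUT 'http://192.168.5.15:9200/_template/new-nginx' -H 'Content-Type: application/json' -d '
{
  "template": "new-nginx-*",
  "order": 0,
  "mappings": {
    "_default_": {
      "properties": {
        "geoip": {
          "dynamic": true,
          "properties": {
            "ip":        { "type": "ip" },
            "latitude":  { "type": "half_float" },
            "location":  { "type": "geo_point" },
            "longitude": { "type": "half_float" }
          }
        }
      }
    }
  }
}'
```

Keep in mind that templates are only applied when an index is created, so existing new-nginx-* indices have to be deleted or reindexed before the new mapping takes effect, and the index pattern in Kibana needs to be refreshed afterwards.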