GeoIP logstash 5.3 with filebeat, elasticsearch and Kibana - geoip lookup failure

I'm using Filebeat to send Apache logs from a Windows system to my Logstash server on a Linux EC2 instance, and from there to Elasticsearch and Kibana.

Elasticsearch and Kibana - 5.3
Logstash and Filebeat - 5.3

filebeat.yml:

filebeat.prospectors:

- input_type: log

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    #- /var/log/*.log
    #- c:\programdata\elasticsearch\logs\*
    - C:\Users\Sagar\Desktop\elastic_test4\data\log\*

output.logstash:
  # The Logstash hosts
  hosts: ["10.101.00.11:5044"]
  template.name: "filebeat-poc"
  template.path: "filebeat.template.json"
  template.overwrite: false

logstash.conf on the Ubuntu Linux EC2 instance:

input {
  beats {
    port => 5044
  }
}
filter {
  grok {
      match => {
        "message" => "%{COMBINEDAPACHELOG}"
      }
  }
  geoip {
      source => "clientip"
      target => "geoip"
      add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}"  ]
  }
  mutate {
    convert => [ "[geoip][coordinates]", "float" ]
  }
}
output {
  elasticsearch {
    hosts => ["elastic-instance-1.es.amazonaws.com:80"]
    index => "apache-%{+YYYY.MM.dd}"
    document_type => "apache_logs"
  }
  stdout { codec => rubydebug }
}

My dummy log file:

64.242.88.10 - - [07/Mar/2004:16:05:49 -0800] "GET /twiki/bin/edit/Main/Double_bounce_sender?topicparent=Main.ConfigurationVariables HTTP/1.1" 401 12846
64.242.88.10 - - [07/Mar/2004:16:06:51 -0800] "GET /twiki/bin/rdiff/TWiki/NewUserTemplate?rev1=1.3&rev2=1.2 HTTP/1.1" 200 4523
64.242.88.10 - - [07/Mar/2004:16:10:02 -0800] "GET /mailman/listinfo/hsdivision HTTP/1.1" 200 6291
64.242.88.10 - - [07/Mar/2004:16:11:58 -0800] "GET /twiki/bin/view/TWiki/WikiSyntax HTTP/1.1" 200 7352
64.242.88.10 - - [07/Mar/2004:16:20:55 -0800] "GET /twiki/bin/view/Main/DCCAndPostFix HTTP/1.1" 200 5253
64.242.88.10 - - [07/Mar/2004:16:23:12 -0800] "GET /twiki/bin/oops/TWiki/AppendixFileSystem?template=oopsmore&param1=1.12&param2=1.12 HTTP/1.1" 200 11382
64.242.88.10 - - [07/Mar/2004:16:24:16 -0800] "GET /twiki/bin/view/Main/PeterThoeny HTTP/1.1" 200 4924
64.242.88.10 - - [07/Mar/2004:16:29:16 -0800] "GET /twiki/bin/edit/Main/Header_checks?topicparent=Main.ConfigurationVariables HTTP/1.1" 401 12851
64.242.88.10 - - [07/Mar/2004:16:30:29 -0800] "GET /twiki/bin/attach/Main/OfficeLocations HTTP/1.1" 401 12851
64.242.88.10 - - [07/Mar/2004:16:31:48 -0800] "GET /twiki/bin/view/TWiki/WebTopicEditTemplate HTTP/1.1" 200 3732
64.242.88.10 - - [07/Mar/2004:16:32:50 -0800] "GET /twiki/bin/view/Main/WebChanges HTTP/1.1" 200 40520
64.242.88.10 - - [07/Mar/2004:16:33:53 -0800] "GET /twiki/bin/edit/Main/Smtpd_etrn_restrictions?topicparent=Main.ConfigurationVariables HTTP/1.1" 401 12851

I am able to send those logs to Elasticsearch and the Kibana dashboard. The pipeline is set up and working, but geoip is not.

This is my Kibana output on search:

{
        "_index": "apache-2017.06.15",
        "_type": "apache_logs",
        "_id": "AVyqJhi6ItD-cRj2_AW6",
        "_score": 1,
        "_source": {
          "@timestamp": "2017-06-15T05:06:48.038Z",
          "offset": 154,
          "@version": "1",
          "input_type": "log",
          "beat": {
            "hostname": "sagar-machine",
            "name": "sagar-machine",
            "version": "5.3.2"
          },
          "host": "by-df164",
          "source": """C:\Users\Sagar\Desktop\elastic_test4\data\log\apache-log.log""",
          "message": """64.242.88.10 - - [07/Mar/2004:16:05:49 -0800] "GET /twiki/bin/edit/Main/Double_bounce_sender?topicparent=Main.ConfigurationVariables HTTP/1.1" 401 12846""",
          "type": "log",
          "tags": [
            "beats_input_codec_plain_applied",
            "_grokparsefailure",
            "_geoip_lookup_failure"
          ]
        }
      }

Any idea why I am facing this issue?

The logs you're sending do not match the %{COMBINEDAPACHELOG} pattern, so there is no clientip field for the geoip lookup to work on, which is why your events get tagged with _grokparsefailure and _geoip_lookup_failure. It looks like your records are missing the referrer and user_agent fields at the end, which I believe are expected by the pattern you are using.
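If you want to keep testing with that log file as-is, one option (just a sketch) is to let grok try the combined format first and fall back to the common format, which does not include the trailing referrer and user_agent fields:

filter {
  grok {
    # Try the combined format first; fall back to the common format
    # so lines without referrer/user_agent still populate clientip.
    match => {
      "message" => [ "%{COMBINEDAPACHELOG}", "%{COMMONAPACHELOG}" ]
    }
  }
}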

Hi,
Thanks for the response. But could you give me one sample Apache log line?
That would be great for me.

Thanks a lot. I am getting confused by all of this.

Here is an example:

85.214.196.224 - - [30/May/2014:09:31:13 -0500] "GET /files/xdotool/docs/html/files.html HTTP/1.1" 200 4465 "http://semicomplete.com/files/xdotool/docs/html/osx__hacks_8h_source.html" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_5) AppleWebKit/537.73.11 (KHTML, like Gecko) Version/6.1.1 Safari/537.73.11"

Now I am using the same configuration with this new line as the only Apache line in my log file, and nothing is working. I believe my Logstash is unable to parse the logs.

What is written to stdout when you process an event?
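To rule out the Filebeat side, you can feed a line straight into the same filter with a stdin pipeline (a sketch; save it as test.conf, run bin/logstash -f test.conf from the Logstash directory, then paste a log line into the console):

input { stdin { } }
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
output { stdout { codec => rubydebug } }

If the pasted line parses here, the problem is upstream in Filebeat; if it still fails with _grokparsefailure, the problem is the pattern.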

Nothing. There is no processing at the Logstash server.

Have you added the new test data as a new file so that Filebeat will pick it up?

Yes, of course. I added the line you suggested. But right now I have mistakenly deleted Gemfile.lock and am facing a new issue. Please help me. I believe your solution will work, but I am unable to test it.

This is the error line:

Bundler::GemNotFound: Could not find gem 'ci_reporter_rspec (= 1.0.0) java' in any of the gem sources listed in your Gemfile or installed on this machine.
