Kibana can't parse the logs in the Discover tab

Hey! I'm working on analysing nginx logs with the ELK stack. The pipeline is Filebeat > Logstash > Elasticsearch > Kibana, in that order.

This is my custom grok pattern:

NGINX %{DATA:ipadress} %{DATA:ident} %{DATA:auth} \[%{HTTPDATE:timestamp}\] \"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|-)\" %{NUMBER:response_code} %{NUMBER:bytes:int} %{NUMBER:request_time:float} %{NUMBER:upstream_response_time:float} %{NOTSPACE:referer} %{QS:agent} %{DATA:zipcode}

And this is my Logstash pipeline config:

input {
    beats {
        port => "5044"
    }
}
filter {
    grok {
        match => { "message" => "%{NGINX}" }
    }
    date {
        match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
    }
    useragent {
        source => "agent"
        target => "user_agent"
    }
}
output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "accesslog"
        document_type => "logs"
    }
}
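For context: NGINX above is a custom pattern, so for %{NGINX} to resolve it has to sit in a patterns file that grok can find, for example one loaded via patterns_dir. A rough sketch of that wiring (the path is just illustrative):

grok {
    patterns_dir => ["/etc/logstash/patterns"]
    match => { "message" => "%{NGINX}" }
}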

Everything is working fine except for the parsing of the logs. This is my Discover tab: no parsed fields, just the raw message.

But when I go to the Dev Tools tab and run GET myindexname/_search:

"hits": {
    "total": 1800375,
    "max_score": 1,
    "hits": [
      {
        "_index": "myindexname",
        "_type": "logs",
        "_id": "dIG0imIBIdydw1SjW1tG",
        "_score": 1,
        "_source": {
          "offset": 833276,
          "prospector": {
            "type": "log"
          },
          "@timestamp": "2017-08-08T04:29:43.000Z",
          "bytes": 804,
          "host": "tez-elastic",
          "timestamp": "08/Aug/2017:07:29:43 +0300",
          "source": "/access.log.195-enc",
          "agent": """"/4.1.2 CFNetwork/811.5.4 Darwin/16.7.0"""",
          "beat": {
            "hostname": "tez-elastic",
            "name": "tez-elastic",
            "version": "6.2.3"
          },
          "ipadress": "84e959425aeceef330c1d18c5647ecd6",
          "@version": "1",
          "message": """84e959425aeceef330c1d18c5647ecd6 - - [08/Aug/2017:07:29:43 +0300] "GET /some/path HTTP/1.1" 200 804 0.079 0.079 ."-" "/4.1.2 CFNetwork/811.5.4 Darwin/16.7.0" "-"""",
          "request": "/some/path",
          "upstream_response_time": 0.079,
          "response_code": "200",
          "tags": [
            "beats_input_codec_plain_applied"
          ],
          "user_agent": {
            "device": "iOS-Device",
            "os_name": "iOS",
            "major": "811",
            "build": "",
            "minor": "5",
            "name": "CFNetwork",
            "os": "iOS",
            "patch": "4"
          },
          "ident": "-",
          "verb": "GET",
          "httpversion": "1.1",
          "request_time": 0.079,
          "auth": "-",
          "referer": """."-""""
        }
      }

So I can see that Elasticsearch does have the parsed fields, and the funny thing is that I can even build visualisations on those fields in Kibana. What do you think my problem is? Thanks in advance!

In the pasted screenshot, the document includes the _grokparsefailure tag, which indicates that grok parsing failed.
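A quick way to see how many documents are affected is to search for that tag in Dev Tools. A sketch, assuming the accesslog index name from your config (depending on the mapping you may need a term query on tags.keyword instead):

GET accesslog/_search
{
  "size": 1,
  "query": {
    "match": { "tags": "_grokparsefailure" }
  }
}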

I hand-copied the text from the screenshot:

cc08d0e73 - - [26/May/2017:08:50:29 +0300] "GET /assets/img/icons/iosStore.png HTTP/1.1" 200 476 0.000 - ."REDACTED" Mozilla/5.0 (Windows NT 6.1; rv:51.0) Gecko/20100101 Firefox/51.0" "-"

I then used the Grok Constructor to determine where the pattern failed; since the constructor can't look inside named patterns, I pasted the definition of your NGINX pattern as the pattern to match against.

I got:

NOT MATCHED. The longest regex prefix matching the beginning of this line is as follows:

prefix:      %{NOTSPACE:ipadress} %{DATA:ident} %{DATA:auth} \[%{HTTPDATE:timestamp}\] \"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|-)\" %{NUMBER:response_code} %{NUMBER:bytes:int} %{NUMBER:request_time:float}
after match: - ."REDACTED" Mozilla/5.0 (Windows NT 6.1; rv:51.0) Gecko/20100101 Firefox/51.0" "-"

Where your pattern expects the next field to be a NUMBER (%{NUMBER:upstream_response_time:float}), this line has a literal hyphen (-) instead, so the whole match fails and the event gets tagged with _grokparsefailure.
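One way to handle that is to let the pattern accept a hyphen as an alternative to the number, the same way your request group already allows a bare "-". A sketch of that change (if other fields can also show up as "-" in your logs, they would need the same treatment):

NGINX %{DATA:ipadress} %{DATA:ident} %{DATA:auth} \[%{HTTPDATE:timestamp}\] \"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|-)\" %{NUMBER:response_code} %{NUMBER:bytes:int} %{NUMBER:request_time:float} (?:%{NUMBER:upstream_response_time:float}|-) %{NOTSPACE:referer} %{QS:agent} %{DATA:zipcode}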
