Grok failure and Failed to parse query


(Deyvid) #1

Hello, guys.
I've been trying for a couple of days to set up Logstash 6.3.0 on Windows Server 2012 R2. I want to parse IIS logs (IIS version 8.5.9600) into Elasticsearch. Here is my Logstash config:

input {
  file {
    type => "IISLog"
    path => "C:/inetpub/logs/LogFiles/W3SVC*/*.log"
    start_position => "beginning"
  }
}

filter {
  if [message] =~ "^#" {
    drop {}
  }

  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{IPORHOST:site} %{WORD:method} %{URIPATH:page} %{NOTSPACE:querystring} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clienthost} %{NOTSPACE:useragent} (%{URI:referer})? %{NUMBER:response} %{NUMBER:subresponse} %{NUMBER:scstatus} %{NUMBER:time_taken}"]
  }

  date {
    match => [ "log_timestamp", "YYYY-MM-dd HH:mm:ss" ]
    timezone => "Etc/UCT"
  }

  useragent {
    source => "useragent"
    prefix => "browser_"
  }

  mutate {
    remove_field => [ "log_timestamp" ]
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch { hosts => ["10.8.238.11:9200"] }
}

But for some reason I get a _grokparsefailure:

{
          "type" => "IISLog",
    "@timestamp" => 2018-06-28T06:26:37.447Z,
          "tags" => [
        [0] "_grokparsefailure"
    ],
          "host" => "WIN-Example",
      "@version" => "1",
       "message" => "2018-06-28 05:22:23 W3SVC7 WIN-Example 1.1.1.1 GET /api/sportmatch/Get sportID=2357 80 - 192.168.0.1 Mozilla/5.0+(Windows+NT+6.1;+WOW64)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/64.0.3282.186+YaBrowser/18.3.1.1232+Yowser/2.5+Safari/537.36 https://example.net/sport 200 0 0 2759\r",
          "path" => "C:/inetpub/logs/LogFiles/W3SVC7/u_ex180628.log"
}
Elasticsearch version is 6.3.0

The output from Elasticsearch is:

type": "query_shard_exception",
"reason": "Failed to parse query [host:()]",
"index_uuid": "XCV-7yPnTdSpJXY-xD5sqA",
"index": "logstash-2018.06.28"

Please help me figure out where I'm wrong. Thanks!


(Magnus Bäck) #2

The string

2018-06-28 05:22:23 W3SVC7 WIN-Example 1.1.1.1 GET

obviously doesn't match this grok expression:

%{TIMESTAMP_ISO8601:log_timestamp} %{IPORHOST:site} %{WORD:method} %{URIPATH:page} %{NOTSPACE:querystring} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clienthost} %{NOTSPACE:useragent} (%{URI:referer})? %{NUMBER:response} %{NUMBER:subresponse} %{NUMBER:scstatus} %{NUMBER:time_taken}

I'm pretty sure WIN-Example doesn't match WORD and it's not obvious that 1.1.1.1 matches URIPATH.
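There also seems to be an extra computer-name field in the line that the expression doesn't account for. An untested sketch that adds that field and loosens the too-strict patterns to NOTSPACE (field names like hostname and serverip are my assumptions, not from your config):

grok {
  match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{NOTSPACE:site} %{NOTSPACE:hostname} %{IPORHOST:serverip} %{WORD:method} %{URIPATH:page} %{NOTSPACE:querystring} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clienthost} %{NOTSPACE:useragent} %{NOTSPACE:referer} %{NUMBER:response} %{NUMBER:subresponse} %{NUMBER:scstatus} %{NUMBER:time_taken}"]
}

Against the sample line this would map W3SVC7 to site, WIN-Example to hostname, and 1.1.1.1 to serverip, with the rest falling into place.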


(Deyvid) #3

So how can I parse this message? From it I get the hostname, which endpoint was requested, the IP of the host, the client browser, etc. :frowning:


(Deyvid) #4

Example IIS log lines which I want to parse:

2018-06-25 20:13:43 W3SVC7 WIN-ExampleHost IP-AddressHost GET /ui/externallogin logintoken=&viewtype=Europe&oddformat=American&search=&lang=en-US&deviceType=desktop 80 - 162.158.122.130 Mozilla/5.0+(Windows+NT+6.3;+Win64;+x64)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/67.0.3396.87+Safari/537.36 https://www.example.site/esports 302 0 0 11927
2018-06-25 20:13:44 W3SVC7 WIN-ExampleHost HostIP GET /UI - 80 - 162.158.122.130 Mozilla/5.0+(Windows+NT+6.3;+Win64;+x64)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/67.0.3396.87+Safari/537.36 https://www.examplesite.ag/esports 200 0 0 1448


(Magnus Bäck) #5

Perhaps NOTSPACE would be a better grok pattern to use. It matches any number of non-whitespace characters.

You might prefer to use a dissect or csv filter to parse this simple whitespace-separated log.
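For example, an untested dissect sketch for the whitespace-separated lines above (the field names are assumptions):

dissect {
  mapping => {
    "message" => "%{log_timestamp} %{+log_timestamp} %{site} %{hostname} %{serverip} %{method} %{page} %{querystring} %{port} %{username} %{clienthost} %{useragent} %{referer} %{response} %{subresponse} %{scstatus} %{time_taken}"
  }
}

The %{+log_timestamp} append modifier joins the time onto the date, so a date filter matching "YYYY-MM-dd HH:mm:ss" could still be applied afterwards.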


(Deyvid) #6

Okay, I changed the IIS logging to CSV, separated with commas. Check the log:

162.158.210.14, -, 6/28/2018, 7:50:33, W3SVC7, WIN-BAEV4FMVCD5, 46.16.78.130, 7108, 763, 485, 200, 0, GET, /api/sportmatch/Get, categoryID=6445&sportID=2357,

Logstash stdout output:

{
          "host" => "WIN-BAEV4FMVCD5",
       "message" => "162.158.210.14, -, 6/28/2018, 8:02:13, W3SVC7, WIN-BAEV4FMVCD5, 46.16.78.130, 49, 887, 2204, 200, 0, POST, /api/sportmatch/GetLive, isGetTop=false&liveIds=%5B1185048%5D,\r",
      "@version" => "1",
          "type" => "IISLog",
          "tags" => [
        [0] "_grokparsefailure"
    ],
          "path" => "C:/inetpub/logs/LogFiles/W3SVC7/u_in18062808.log",
    "@timestamp" => 2018-06-28T08:02:31.803Z
}

And again I get [0] "_grokparsefailure".


(Deyvid) #7

Does that mean the grok filter is trying to parse the message field? But there is no such field in the IIS log?


(Magnus Bäck) #8

You didn't have to add the comma. The csv filter has a configurable delimiter.
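A csv filter sketch for the comma-separated sample you posted (untested; the column names are my guesses at the IIS log format fields, not something the filter knows about):

csv {
  separator => ","
  columns => ["clientip", "username", "log_date", "log_time", "site", "hostname", "serverip", "time_taken", "bytes_received", "bytes_sent", "response", "win32_status", "method", "page", "querystring"]
}

Since your sample has a space after each comma, you may also want a mutate filter with the strip option to trim the leading spaces from the resulting fields.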

I can't help with your _grokparsefailure if I don't know what your configuration looks like.


(Deyvid) #9

I want these logging fields. Should I choose the W3C or the IIS log format?


(Deyvid) #10

I think my grok filter is now trying to parse something that is missing.


(Magnus Bäck) #11

I want these logging fields. Should I choose the W3C or the IIS log format?

Either way. Logstash can parse either format.


(Deyvid) #12

Okay, I will use IIS logs


(Deyvid) #13

Can you help me configure the grok filter?


(Magnus Bäck) #14

Then I need to know:

  • What does your current configuration look like?
  • What does your stdout output produce?

(Deyvid) #15

Thanks a lot :sunny:

Let's start from the beginning.

I have one file, first-pipeline.conf, for Logstash:

input {
  file {
    type => "IISLog"
    path => "C:/inetpub/logs/LogFiles/W3SVC*/*.log"
    start_position => "beginning"
  }
}

filter {
  if [message] =~ "^#" {
    drop {}
  }

  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{IPORHOST:site} %{WORD:method} %{URIPATH:page} %{NOTSPACE:querystring} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clienthost} %{NOTSPACE:useragent} (%{URI:referer})? %{NUMBER:response} %{NUMBER:subresponse} %{NUMBER:scstatus} %{NUMBER:time_taken}"]
  }

  date {
    match => [ "log_timestamp", "YYYY-MM-dd HH:mm:ss" ]
    timezone => "Etc/UCT"
  }

  useragent {
    source => "useragent"
    prefix => "browser_"
  }

  mutate {
    remove_field => [ "log_timestamp" ]
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch { hosts => ["10.8.238.11:9200"] }
}

elasticsearch.yml:

# Set the bind address to a specific IP (IPv4 or IPv6):
network.host: 10.8.238.11

# Set a custom port for HTTP:
http.port: 9200

I use Grafana 5.0.0; this is my datasource config.

I already have a dashboard, but I receive this error:

"root_cause": [
{
"type": "query_shard_exception",
"reason": "Failed to parse query [host:()]",
"index_uuid": "XCV-7yPnTdSpJXY-xD5sqA",
"index": "logstash-2018.06.28"
}
],
"type": "search_phase_execution_exception",
"reason": "all shards failed",
"phase": "query",
"grouped": true,
"failed_shards": [
{
"shard": 0,
"index": "logstash-2018.06.28",
"node": "SyyrXc0bS5CYpcVmGcaqGA",
"reason": {
"type": "query_shard_exception",
"reason": "Failed to parse query [host:()]",
"index_uuid": "XCV-7yPnTdSpJXY-xD5sqA",
"index": "logstash-2018.06.28",
"caused_by": {
"type": "parse_exception",
"reason": "Cannot parse 'host:()': Encountered \" \")\" \") \"\" at line 1, column 6.\r\nWas expecting one of:\r\n    <NOT> ...\r\n    \"+\" ...\r\n    \"-\" ...\r\n    <BAREOPER> ...\r\n    \"(\" ...\r\n    \"*\" ...\r\n    <QUOTED> ...\r\n    <TERM> ...\r\n    <PREFIXTERM> ...\r\n    <WILDTERM> ...\r\n    <REGEXPTERM> ...\r\n    \"[\" ...\r\n    \"{\" ...\r\n    <NUMBER> ...\r\n    ",


(Deyvid) #16

Example log lines from IIS logging:

162.158.210.14, -, 6/28/2018, 9:09:25, W3SVC7, WIN-BAEV4FMVCD5, 10.10.0.71, 58, 763, 485, 200, 0, GET, /api/sportmatch/Get, categoryID=3597&sportID=2357,
162.158.210.14, -, 6/28/2018, 9:09:29, W3SVC7, WIN-BAEV4FMVCD5, 10.10.0.71, 14646, 773, 6474, 200, 0, GET, /sport/outrights, -,
162.158.210.14, -, 6/28/2018, 9:09:29, W3SVC7, WIN-BAEV4FMVCD5, 10.10.0.71, 93, 695, 4205, 200, 0, GET, /signalr/hubs, -,


(Magnus Bäck) #17

"reason": "Cannot parse 'host:()': Encountered \" \")\" \") \"\" at line 1, column 6.\r\nWas expecting one of:\r\n    <NOT> ...\r\n    \"+\" ...\r\n    \"-\" ...\r\n    <BAREOPER> ...\r\n    \"(\" ...\r\n    \"*\" ...\r\n    <QUOTED> ...\r\n    <TERM> ...\r\n    <PREFIXTERM> ...\r\n    <WILDTERM> ...\r\n    <REGEXPTERM> ...\r\n    \"[\" ...\r\n    \"{\" ...\r\n    <NUMBER> ...\r\n    ",

This looks more like a Grafana bug or configuration problem.
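The query that fails, host:(), has nothing between the parentheses, which Lucene rejects. In the Grafana query field you would either drop the host clause entirely or give it a concrete value, e.g. (hypothetical value, matching the host seen in your events):

host:WIN-Example AND type:IISLog

An empty filter widget in the dashboard is the usual cause of the empty parentheses being generated.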


(system) #18

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.