Grok filter is not working (custom pattern)

I have a log file (http://codepad.org/vAMFhhR2), and I want to extract a specific number from it (line 18).
I wrote a custom pattern grok filter, tested it on http://grokdebug.herokuapp.com/, it works fine and extracts my desired value.

Here's what my logstash.conf looks like (http://codepad.org/FxY8ZXUa), but it doesn't match any record from the same log in Kibana.

can you help?


What is the query you are using when running in Kibana?

I don't run any queries in Kibana, @jkuang.

Have you checked by making Logstash output to stdout instead of Elasticsearch? Can you paste the output here?

input {
	tcp {
		port => 5000
	}
}

## Add your filters / logstash plugins configuration here
filter {
	grok{
		 match => [ "message", "(?<scraped>(?<='item_scraped_count': ).*(?=,))" ]
	}
}

output {
	stdout {
		codec => rubydebug
	}
}
logstash_1       | {
logstash_1       |     "@timestamp" => 2017-04-12T11:22:38.841Z,
logstash_1       |           "port" => 39100,
logstash_1       |       "@version" => "1",
logstash_1       |           "host" => "172.18.0.1",
logstash_1       |        "message" => "2017-01-01 07:57:22 [statscollectors.py] INFO: Dumping Scrapy stats:",
logstash_1       |           "tags" => [
logstash_1       |         [0] "_grokparsefailure"
logstash_1       |     ]
logstash_1       | }

Here's part of the output; it's the same for every line in the log. I guess there's a problem with my regex.
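To see why the tag appears, you can replay the same regex outside Logstash. A minimal Python sketch (Python's `re` needs `(?P<name>...)` for named groups, but the lookbehind/lookahead parts are unchanged; the first sample line is taken from the rubydebug output above, and the `341` stats line is made up for illustration):

```python
import re

# Same pattern as in the grok filter, with Python's named-group syntax.
pattern = re.compile(r"(?P<scraped>(?<='item_scraped_count': ).*(?=,))")

# Line taken from the rubydebug output above -- no match here, which is
# exactly why grok tags that event with _grokparsefailure.
line_without = "2017-01-01 07:57:22 [statscollectors.py] INFO: Dumping Scrapy stats:"
print(pattern.search(line_without))                # -> None

# A hypothetical stats line of the shape the pattern targets
# (the value 341 is made up for illustration).
line_with = " 'item_scraped_count': 341,"
print(pattern.search(line_with).group("scraped"))  # -> 341
```

So the regex itself is fine; it simply has nothing to match on every log line except the one stats line, and grok flags each non-matching event.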

In Logstash, the grok pattern must match the event before you can extract the particular field you want; any event it doesn't match gets tagged _grokparsefailure. The Grok Debugger you used simply ignores everything in the line besides the part you specified.
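One way to make the match explicit is to put the surrounding literal text in the pattern and capture only the number with a built-in grok pattern. A sketch, keeping the field name `scraped` from the original config (adjust the literal to match your log exactly):

filter {
	grok {
		match => [ "message", "'item_scraped_count': %{NUMBER:scraped}," ]
	}
}

Events that don't contain the stats line will still be tagged _grokparsefailure; you can drop those events or simply ignore the tag downstream.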

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.