I am following this tutorial;
https://www.elastic.co/guide/en/logstash/current/advanced-pipeline.html
The grok pattern is not working; I cannot get the result the article says to expect.
Show us your configuration, a sample log message that you're trying to parse, the results you actually get (use a stdout { codec => rubydebug }
output), and the results that you expected.
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate_authorities => ["/etc/ca/ca.crt"]
    ssl_certificate => "/etc/ca/server.crt"
    ssl_key => "/etc/ca/server.key"
    ssl_verify_mode => "force_peer"
  }
}
filter {
  grok {
    match => ["message", "%{COMBINEDAPACHELOG}"]
  }
}
output {
  stdout { codec => rubydebug }
}
The result I got:
83.149.9.216 - - [04/Jan/2015:05:13:42 +0000] "GET /presentations/logstash-monitorama-2013/images/kibana-search.png HTTP/1.1" 200 203023 "http://semicomplete.com/presentations/logstash-monitorama-2013/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/32.0.1700.77 Safari/537.36"
The error tag: "_grokparsefailure"
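For reference, the combined Apache log format that %{COMBINEDAPACHELOG} targets can be approximated with a plain regex. The sketch below is a simplification for illustration (it is not grok's actual shipped pattern) and parses the sample line from this thread; note that a truncated line fails to match, which is exactly what produces a _grokparsefailure tag:

```python
import re

# Simplified approximation of grok's COMBINEDAPACHELOG pattern
# (an assumption for illustration, not the exact pattern grok ships).
COMBINED = re.compile(
    r'(?P<clientip>\S+) (?P<ident>\S+) (?P<auth>\S+) '
    r'\[(?P<timestamp>[^\]]+)\] '
    r'"(?P<verb>\S+) (?P<request>\S+)(?: (?P<httpversion>[^"]+))?" '
    r'(?P<response>\d{3}) (?P<bytes>\d+|-) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

# The sample line from the tutorial's test data.
line = ('83.149.9.216 - - [04/Jan/2015:05:13:42 +0000] '
        '"GET /presentations/logstash-monitorama-2013/images/kibana-search.png HTTP/1.1" '
        '200 203023 "http://semicomplete.com/presentations/logstash-monitorama-2013/" '
        '"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_1) AppleWebKit/537.36 '
        '(KHTML, like Gecko) Chrome/32.0.1700.77 Safari/537.36"')

m = COMBINED.match(line)
print(m.group('clientip'), m.group('response'))   # 83.149.9.216 200

# A truncated line (e.g. a half-shipped event) does not match at all:
print(COMBINED.match(line[:80]))                  # None
```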
That looks like the input, not the output. Please show the output of the stdout codec, in your log.
input:
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate_authorities => ["/etc/ca/ca.crt"]
    ssl_certificate => "/etc/ca/server.crt"
    ssl_key => "/etc/ca/server.key"
    ssl_verify_mode => "force_peer"
  }
}
filter {
  grok {
    match => { 'message' => '%{COMBINEDAPACHELOG}' }
  }
}
output {
  stdout { codec => rubydebug }
}
output:
{
       "message" => "102.149.9.216 - - [04/Jan/2015:05:13:42 +0000] \"GET /presentations/logstash-monitorama-2013/images/kibana-search.png",
      "@version" => "1",
    "@timestamp" => "2016-06-17T18:37:38.834Z",
          "type" => "log",
         "count" => 1,
        "source" => "/home/ec2-user/logs/test.log",
        "fields" => nil,
          "beat" => {
        "hostname" => "ip-10-0-0-165",
            "name" => "ip-10-0-0-165"
    },
        "offset" => 2933,
    "input_type" => "log",
          "host" => "ip-10-0-0-165",
          "tags" => [
        [0] "beats_input_codec_plain_applied",
        [1] "_grokparsefailure"
    ]
}
{
       "message" => "HTTP/1.1\" 200 203023 \"http://semicomplete.com/presentations/logstash-monitorama-2013/\" \"Mozilla/5.0 (Macintosh; Intel",
      "@version" => "1",
    "@timestamp" => "2016-06-17T18:37:38.834Z",
          "beat" => {
        "hostname" => "ip-10-0-0-165",
            "name" => "ip-10-0-0-165"
    },
          "type" => "log",
         "count" => 1,
        "source" => "/home/ec2-user/logs/test.log",
        "offset" => 3050,
    "input_type" => "log",
        "fields" => nil,
          "host" => "ip-10-0-0-165",
          "tags" => [
        [0] "beats_input_codec_plain_applied",
        [1] "_grokparsefailure"
    ]
}
{
       "message" => "Mac OS X 10_9_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/32.0.1700.77 Safari/537.36\"",
      "@version" => "1",
    "@timestamp" => "2016-06-17T18:37:38.834Z",
        "offset" => 3168,
          "type" => "log",
    "input_type" => "log",
         "count" => 1,
        "fields" => nil,
          "beat" => {
        "hostname" => "ip-10-0-0-165",
            "name" => "ip-10-0-0-165"
    },
        "source" => "/home/ec2-user/logs/test.log",
          "host" => "ip-10-0-0-165",
          "tags" => [
        [0] "beats_input_codec_plain_applied",
        [1] "_grokparsefailure"
    ]
}
Everything is working now. The cause was that the path to the source log I had configured in Filebeat's input was wrong, so Filebeat was shipping partial lines that grok could never match.
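For anyone hitting the same symptom: the fix amounts to pointing Filebeat's input at the correct, complete access log. A minimal sketch of the relevant filebeat.yml section is below; the log path and client cert/key names are hypothetical placeholders (only the CA path comes from the Logstash config in this thread), and current Filebeat syntax is shown, while the 1.x releases of that era used `filebeat: prospectors:` instead:

```yaml
filebeat.inputs:
  - type: log
    paths:
      # hypothetical path -- point this at the actual Apache access log
      - /var/log/httpd/access_log

output.logstash:
  hosts: ["localhost:5044"]
  ssl:
    certificate_authorities: ["/etc/ca/ca.crt"]
    # client cert/key are required because the Logstash beats input
    # sets ssl_verify_mode => "force_peer"; these names are placeholders
    certificate: "/etc/ca/client.crt"
    key: "/etc/ca/client.key"
```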