Hi,
when I import the following log lines, they end up in Elasticsearch without being matched by grok:
2017-02-16 00:03:03 202.152.71.0 - www.host.ch GET /content/specialinterest var1=234&var3=876234 200 - - Mozilla/5.0+(Windows+NT+6.1)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/56.0.2924.87+Safari/537.36 - https://www.refer.com/ par1=123&par2=12354 - get-2017-02-16-00-00000-srvname logidentifier_234523
Here's my Logstash pipeline config:
input {
  # beats {
  #   port => 5044
  # }
  file {
    path => "/home/elk/logtest1/*"
    start_position => "beginning"
    codec => plain {
      charset => "ISO-8859-1"
    }
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601} %{IP:ip} %{USERNAME:username} %{HOSTNAME:host} %{WORD:method} %{URIPATH:uri} %{NOTSPACE:uriryy} %{NUMBER:status}$ %{NOTSPACE:bytes} %{NOTSPACE:version} %{NOTSPACE:UserAgent} %{NOTSPACE:Cookie} %{NOTSPACE:Referer} %{NOTSPACE:gv-var} %{NOTSPACE:var_xyv} %{NOTSPACE:origin} %{NOTSPACE:id5}" }
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
Checking in Kibana, I see the whole log line sitting unparsed in the message field.
I also tried anchoring the start of the pattern:

match => { "message" => "\A%{TIMESTAMP_ISO8601}

without success.
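To narrow it down, here is a minimal sanity check of just the pattern prefix outside Logstash (a Python sketch; the regexes are my own hand-expanded approximations of the TIMESTAMP_ISO8601 and IP grok patterns, not the official logstash-patterns-core definitions):

```python
import re

# Rough stand-ins for the grok patterns (assumption: close enough
# to the real definitions for a prefix test).
TIMESTAMP_ISO8601 = r"\d{4}-\d{2}-\d{2}[T ]\d{2}:\d{2}(?::\d{2}(?:\.\d+)?)?"
IP = r"\d{1,3}(?:\.\d{1,3}){3}"

# Shortened copy of one of the log lines above.
line = ("2017-02-16 00:03:03 202.152.71.0 - www.host.ch GET "
        "/content/specialinterest var1=234&var3=876234 200 - -")

# Anchor at the start of the line, like \A in the grok attempt.
prefix = re.compile(r"\A" + TIMESTAMP_ISO8601 + " " + IP)
print(bool(prefix.match(line)))  # prints True
```

So the leading timestamp-plus-IP part does seem to match the data on its own.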
What could be wrong here?