_grokparsefailure in Logstash


(Aarthini) #1

In Filebeat, my log line format is "2018-05-19 11:00:11,044 (default task-551) 329.0 user1 ip1 twKtby8RnvKWV4hq5nyJ4Dru4Ah1XuZLi9dR9_0S.lrpv2-testing VPL-tbl-btnTblExportinput "
In Logstash, my logstash.conf file is:

input {
  beats { port => 5044 }
}
filter {
  if "newsamp" in [tags] {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:mtimestamp} %{WORD:linename} %{NUMBER:consumetime} %{WORD:username} %{WORD:ipaddress} %{WORD:info} %{WORD:modulename}" }
    }
    mutate {
      convert => {
        "consumetime" => "integer"
      }
    }
    date {
      match => ["mtimestamp", "yyyy-MM-dd HH:mm:ss"]
      target => "mtimestamp"
    }
  }
}
output {
  if "newsamp" in [tags] {
    stdout {
      codec => "rubydebug"
    }
    elasticsearch {
      hosts => "localhost:9200"
      user => "elastic"
      password => "elastic"
      manage_template => false
      index => "testreport"
      document_type => "test"
    }
  }
}

In Kibana I get this output:

{
  "_index": "testreport",
  "_type": "test",
  "_id": "3GVAd2MBHMZ4DJSStT5I",
  "_score": 1,
  "_source": {
    "tags": [
      "newsamp",
      "beats_input_codec_plain_applied",
      "_grokparsefailure"
    ],
    "prospector": {
      "type": "log"
    },
    "@timestamp": "2018-05-19T07:15:17.158Z",
    "@version": "1",
    "offset": 300,
    "host": "Venkateshs-MacBook-Pro.local",
    "message": "2018-05-19 11:00:11,153 (default task-581) 437.0 expline 192.168.20.54\tdx-6xcrO6g6T4aKb7RxBhYeTKGoVaQP8mxSi_uPI.lrpv2-testing FCN-FCN_btnCustomCustomer",
    "beat": {
      "hostname": "Venkateshs-MacBook-Pro.local",
      "version": "6.2.3",
      "name": "Venkateshs-MacBook-Pro.local"
    },
    "source": "/Users/nfrteam/Desktop/ELKSamp/test1.log"
  }
}

And I need mtimestamp, linename, consumetime, username, ipaddress, info, and modulename as separate fields. How do I get these?


(Christian Dahlqvist) #2

The _grokparsefailure tag indicates that the grok pattern failed to match the message. Have a look at this blog post, which describes how to work with Logstash and build a config using grok.
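Looking at the sample line, the likely mismatches are: %{WORD:linename} cannot match "(default task-551)" because of the parentheses and the space, %{WORD:ipaddress} cannot match a dotted IP, and %{WORD:info} cannot match a token containing dots and hyphens. A sketch of a filter that could match this format is below (untested against the full log; field names are kept from the original config, and the separator handling assumes the tab seen before the info field in the indexed message):

```text
filter {
  grok {
    # \(%{DATA:linename}\) captures "default task-551" without the parentheses;
    # IPORHOST matches both "ip1" and "192.168.20.54"; NOTSPACE allows the
    # dots, underscores, and hyphens in the info and module tokens; \s+ also
    # covers the tab that appears before the info field.
    match => { "message" => "%{TIMESTAMP_ISO8601:mtimestamp} \(%{DATA:linename}\) %{NUMBER:consumetime} %{WORD:username} %{IPORHOST:ipaddress}\s+%{NOTSPACE:info} %{NOTSPACE:modulename}" }
  }
  date {
    # the timestamp carries milliseconds after a comma, so include ,SSS
    match => ["mtimestamp", "yyyy-MM-dd HH:mm:ss,SSS"]
    target => "mtimestamp"
  }
}
```

A quick way to iterate on the pattern is to paste the sample line and the expression into Kibana's Grok Debugger (or any grok tester) until all seven fields extract cleanly.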


(Aarthini) #3

Thank you, I changed the grok pattern as described in the blog and now it's working.


(system) #4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.